DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 02/03/2026 has been entered.
Response to Amendment
This is in response to applicant’s amendment/response filed on 12/11/2025, which has been entered and made of record. Claims 1 and 7 have been amended. Claims 1-5, 7-10 and 21-25 are pending in the application.
Response to Arguments
Applicant's arguments filed on 12/11/2025 have been fully considered but they are not persuasive. Applicant has submitted amended claims; accordingly, new grounds of rejection are set forth below. The new grounds of rejection were necessitated by Applicant's amendments to the claims.
Applicants state that “Rothberg et al. does not disclose or even suggest displaying both an instruction indicator and an orientation indicator as recited in amended independent claim 7” and “Because Rothberg does not teach all of the features of amended independent claim 7, the §102 Rejection of independent claim 7, and its dependent claims 8-10, should be withdrawn”.
The examiner disagrees. Rothberg et al. teach display, in the operator video displayed on the instructor processing device, an instruction indicator based on the selected instruction (Figs 3A-3B, par 0194-0195, “FIG. 3A shows an example coarse instruction 302 that may be provided to an operator via a display 306 on a computing device 304. The coarse instruction 302 may be provided when the ultrasound device is positioned outside of a predetermined area on the subject. As shown, the coarse instruction 302 includes an indication of where the operator should position the ultrasound device on the subject to be within the predetermined area. In particular, the coarse instruction 302 comprises a symbol 308 (e.g., a star) showing where the predetermined region is located on a graphical image of the subject 301. The coarse instruction 302 also includes a message 310 with an arrow pointing to the symbol 308 instructing the operator to “POSITION ULTRASOUND DEVICE HERE” to communicate to the operator that the ultrasound device should be placed where the symbol 308 is located on the graphical image of the subject 301. ….the fine instruction 312 includes a symbol 314 indicating which direction the operator should move the ultrasound device. The symbol 314 may be animated in some implementations. For example, the symbol 314 (e.g., an arrow and/or model of the ultrasound device) may move in a direction in which the ultrasound device is to be moved. The fine instruction 312 may also comprise a message 316 that compliments the symbol 314 such as the message “TURN CLOCKWISE.” The symbol 314 and/or the message 316 may be overlaid onto a background image 311.”).
Because the prior art teaches all the limitations of claim 7, including the amended features, the rejection of claim 7 is maintained. The same rationale applies to dependent claims 8-10.
Applicants state that “The combination of Rothberg et al. and Pelissier et al. does not disclose or even suggest displaying both an instruction indicator and an orientation indicator as recited in amended independent claim 1.” and that “the § 103 Rejection of amended independent claim 1 should be withdrawn. By their dependency on independent claim 1, the § 103 Rejections of claims 2-5 and 21-25 should also be withdrawn”.
The examiner disagrees. Rothberg et al. teach display, in the operator video displayed on the instructor processing device, an instruction indicator based on the selected instruction (Figs 3A-3B, par 0194-0195, “FIG. 3A shows an example coarse instruction 302 that may be provided to an operator via a display 306 on a computing device 304. The coarse instruction 302 may be provided when the ultrasound device is positioned outside of a predetermined area on the subject. As shown, the coarse instruction 302 includes an indication of where the operator should position the ultrasound device on the subject to be within the predetermined area. In particular, the coarse instruction 302 comprises a symbol 308 (e.g., a star) showing where the predetermined region is located on a graphical image of the subject 301. The coarse instruction 302 also includes a message 310 with an arrow pointing to the symbol 308 instructing the operator to “POSITION ULTRASOUND DEVICE HERE” to communicate to the operator that the ultrasound device should be placed where the symbol 308 is located on the graphical image of the subject 301. ….the fine instruction 312 includes a symbol 314 indicating which direction the operator should move the ultrasound device. The symbol 314 may be animated in some implementations. For example, the symbol 314 (e.g., an arrow and/or model of the ultrasound device) may move in a direction in which the ultrasound device is to be moved. The fine instruction 312 may also comprise a message 316 that compliments the symbol 314 such as the message “TURN CLOCKWISE.” The symbol 314 and/or the message 316 may be overlaid onto a background image 311.”).
Because the prior art teaches all the limitations of claim 1, including the amended features, the rejection of claim 1 is maintained. The same rationale applies to dependent claims 2-5 and 21-25.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 7-10 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by U.S. PGPub 2017/0360412 to Rothberg et al.
Regarding Claim 7, Rothberg et al. teach an ultrasound system (ultrasound system 100 in Fig. 1; the system and its various elements are numbered differently in later Figures, but is generally compatible throughout the Figures as described in par 0341, par 0346), comprising: an instructor processing device configured to (par 0144, 'App'; par 0005, 'The App may provide real-time guidance to the operator regarding how to properly position the ultrasound device on the subject to capture a medically relevant ultrasound image. For example, the operator may place the ultrasound device on the subject and receive feedback from the App regarding how to move the ultrasound device on the subject. The feedback may be a sequence of instructions each including a particular direction to move the ultrasound device (e.g., up, down, left, right, rotate clockwise, or rotate counter-clockwise)'; par 0146, 'In another embodiment, the App may be executed on a cloud and communicated to the operator through the smart device. In yet another embodiment, the App may be executed on the ultrasound device itself and the instructions may be communicated to the user either through the ultrasound device itself or a smart device associated with the ultrasound device. Thus, it should be noted that the execution of the App may be at a local or a remote device without departing from the disclosed principles'; therefore, the App acts as the instructor and may be provided on a separate device that is not the operator processing device):
receive a current pose of an ultrasound device (Fig 1, ultrasound device 102) relative to an operator processing device (computing device 104 in Fig. 1; par 0164, 'a pose (e.g., position and/or orientation) of the ultrasound device in the captured image may be identified using an automated image processing technique (e.g., a deep learning technique)'; par 0236, 'the movement information from the sensors in both the ultrasound device and the computing device may be used in concert to identify the pose of the ultrasound device relative to the computing device and, thereby, identify the pose of the ultrasound device in the captured non-acoustic image') from the operator processing device (Fig. 1, 104; par 0146, 'In another embodiment, the App may be executed on a cloud and communicated to the operator through the smart device. In yet another embodiment, the App may be executed on the ultrasound device itself and the instructions may be communicated to the user either through the ultrasound device itself or a smart device associated with the ultrasound device. Thus, it should be noted that the execution of the App may be at a local or a remote device without departing from the disclosed principles'; therefore, the App acts as the instructor and may be provided on a separate device that is not the operator processing device; as the App provides the instructions shown in examples like Fig. 3B and 5B, the separate instruction device holding the App is capable of viewing those same instructions based upon the pose data sent from the computing device 104 to said App);
display, in an instruction interface (par 0310, 'The display screen 1508 may be configured to display images and/or videos') displayed on a display screen of the instructor processing device (par 0144, 'App'; Fig. 1; par 0146, 'In another embodiment, the App may be executed on a cloud and communicated to the operator through the smart device. In yet another embodiment, the App may be executed on the ultrasound device itself and the instructions may be communicated to the user either through the ultrasound device itself or a smart device associated with the ultrasound device. Thus, it should be noted that the execution of the App may be at a local or a remote device without departing from the disclosed principles'; therefore, the App acts as the instructor and may be provided on a separate device that is not the operator processing device; as the App provides the instructions shown in examples like Fig. 3B and 5B, the separate instruction device holding the App is capable of viewing those same instructions based upon the pose data sent from the computing device 104 to said App), based on the current pose of the ultrasound device (Fig 1, 102) relative to the operator processing device (Fig 1, 104), an orientation indicator (such as symbol 314 in Fig. 3B or instructions 516 in Fig. 5B) indicating the current pose of the ultrasound device relative to the operator processing device (Fig 1, 104; Fig. 3B shows a direction symbol 314; Fig. 5B shows an instruction 516; par 0195, 'the fine instruction 312 includes a symbol 314 indicating which direction the operator should move the ultrasound device'; par 0058, “the at least one processor is configured to generate the augmented reality interface at least in part by overlaying an instruction to the operator of the ultrasound device onto the image using the pose of the ultrasound device …the at least one processor is configured to analyze the at least one characteristics of the marker in the image at least in part by identifying a color of the marker in the image. In some embodiments, the at least one processor is configured to identify the pose of the ultrasound device at least in part by identifying an orientation of the ultrasound device in the image using the color of the marker in the image”, par 0164-0165, “a pose (e.g., position and/or orientation) of the ultrasound device in the captured image may be identified using an automated image processing technique (e.g., a deep learning technique) and the information regarding the pose of the ultrasound device may be used to overlay an instruction onto at least part of the ultrasound device in the captured image. Example instructions that may be overlaid onto the image of the ultrasound device include symbols (such as arrows) indicating a direction in which the operator is to move the device …. the operator may gain a better appreciation for the particular region of the subject that is being imaged given the current position of the ultrasound device on the subject”, par 0203, 'an instruction 516 indicative of a direction for the operator to move the ultrasound device 502, a symbol indicating a location of the target anatomical plane, and/or an ultrasound image 514 captured by the ultrasound device 502 may be overlaid onto the image 512'; par 0206, “the instruction 516 may be overlaid onto the image 512 such that at least a portion of the instruction 516 is overlaid onto the ultrasound device 502 in the image 512. 
The computing device 504 may, for example, use the marker 510 to identify a pose (e.g., a position and/or orientation) of the ultrasound device 502 in the image 512 and position the instruction 516 in the augmented reality interface using the identified pose … The computing device 504 may identify the pose of the ultrasound device 502 in any of a variety of ways. In some embodiments, the computing device may identify a position of the ultrasound device 502 in the image 512 by identifying a location of the marker 510. The location of the marker 510 may be identified by searching for one or more distinct characteristics of the marker 510 in the image 512. Additionally (or alternatively), the computing device may identify an orientation of the ultrasound device 502 in the image 512 by analyzing one or more characteristics of the marker 512. For example, the marker 510 may be a dispersive marker and the computing device may identify an orientation of the ultrasound device 502 in the image 512 by identifying a color of the marker 510 in the image 512. In another example, the marker 510 may be a holographic marker and the computing device may identify an orientation of the ultrasound device 502 in the image 512 by identifying an image presented by the marker 510 in the image 512. In yet another example, the marker 510 may be a patterned monochrome marker and the computing device may identify an orientation of the ultrasound device 502 in the image 512 by identifying an orientation of the pattern on the marker 510 in the image 512”, par 0238, 'the movement information from the sensors in both the ultrasound device and the computing device may be used in concert to identify the pose of the ultrasound device relative to the computing device and, thereby, identify the pose of the ultrasound device in the captured non-acoustic image', also par 0243),
the instruction interface comprises a circle, and the orientation indicator is at a particular position around the circle that indicates the current pose of the ultrasound device relative to the operator processing device (Fig 5B, par 0204-0206, “the display 508 in the computing device 504 displays an augmented reality interface comprising a non-acoustic image 512 of the ultrasound device 502 being used on the subject 501 (e.g., captured by the imaging device 506) and one or more elements overlaid onto the image 512. For example, an instruction 516 indicative of a direction for the operator to move the ultrasound device 502, a symbol indicating a location of the target anatomical plane, and/or an ultrasound image 514 captured by the ultrasound device 502 may be overlaid onto the image 512 …..the instruction 516 may be overlaid onto the image 512 such that at least a portion of the instruction 516 is overlaid onto the ultrasound device 502 in the image 512. The computing device 504 may, for example, use the marker 510 to identify a pose (e.g., a position and/or orientation) of the ultrasound device 502 in the image 512 and position the instruction 516 in the augmented reality interface using the identified pose….. the computing device may identify a position of the ultrasound device 502 in the image 512 by identifying a location of the marker 510. The location of the marker 510 may be identified by searching for one or more distinct characteristics of the marker 510 in the image 512. Additionally (or alternatively), the computing device may identify an orientation of the ultrasound device 502 in the image 512 by analyzing one or more characteristics of the marker 512. For example, the marker 510 may be a dispersive marker and the computing device may identify an orientation of the ultrasound device 502 in the image 512 by identifying a color of the marker 510 in the image 512”, Fig 7E, par 0217, “ The computing device 702 may also transition from the image acquisition screen shown in FIG. 7D to an image acquisition assistance screen shown in FIG. 7E. The image acquisition assistance screen may display an ultrasound image 726 captured using the ultrasound device. In some embodiments, the image acquisition assistance screen may display one or more instructions regarding how to reposition the ultrasound device to obtain an ultrasound image that contains the target anatomical view (e.g., a PLAX view). Once the ultrasound device has been properly positioned, the image acquisition assistance screen may display an indication that the ultrasound device is properly positioned”); and
display, in the operator video displayed on the instructor processing device, an instruction indicator based on the selected instruction (Figs 3A-3B, par 0194-0195, “FIG. 3A shows an example coarse instruction 302 that may be provided to an operator via a display 306 on a computing device 304. The coarse instruction 302 may be provided when the ultrasound device is positioned outside of a predetermined area on the subject. As shown, the coarse instruction 302 includes an indication of where the operator should position the ultrasound device on the subject to be within the predetermined area. In particular, the coarse instruction 302 comprises a symbol 308 (e.g., a star) showing where the predetermined region is located on a graphical image of the subject 301. The coarse instruction 302 also includes a message 310 with an arrow pointing to the symbol 308 instructing the operator to “POSITION ULTRASOUND DEVICE HERE” to communicate to the operator that the ultrasound device should be placed where the symbol 308 is located on the graphical image of the subject 301. ….the fine instruction 312 includes a symbol 314 indicating which direction the operator should move the ultrasound device. The symbol 314 may be animated in some implementations. For example, the symbol 314 (e.g., an arrow and/or model of the ultrasound device) may move in a direction in which the ultrasound device is to be moved. The fine instruction 312 may also comprise a message 316 that compliments the symbol 314 such as the message “TURN CLOCKWISE.” The symbol 314 and/or the message 316 may be overlaid onto a background image 311.”)
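For illustration only, the claim 7 limitation of an orientation indicator placed at a particular position around a circle according to the pose of the ultrasound device relative to the operator processing device can be sketched as follows. This is a minimal sketch, not drawn from Rothberg or from the claims; the 4x4 homogeneous-transform convention, the frame names, and the choice of the device z-axis as the longitudinal axis are all assumptions.
```python
# Minimal sketch (hypothetical names and conventions) of mapping a relative
# pose to a position around a circular instruction interface.
import numpy as np

def relative_pose(T_world_device: np.ndarray, T_world_operator: np.ndarray) -> np.ndarray:
    """Pose of the ultrasound device expressed in the operator device's frame.
    Both inputs are 4x4 homogeneous transforms (rotation + translation)."""
    return np.linalg.inv(T_world_operator) @ T_world_device

def indicator_position_on_circle(T_rel: np.ndarray, center: tuple, radius: float) -> tuple:
    """Place the orientation indicator on the circle at the heading of the
    device's longitudinal axis as seen in the operator device's screen plane."""
    axis = T_rel[:3, :3] @ np.array([0.0, 0.0, 1.0])  # assumed longitudinal (z) axis
    yaw = np.arctan2(axis[1], axis[0])                # heading angle in the screen plane
    return (center[0] + radius * np.cos(yaw),
            center[1] + radius * np.sin(yaw))
```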
Regarding Claim 8, Rothberg et al. teach the ultrasound system of claim 7, and Rothberg et al. further teach wherein the instructor processing device is configured, when displaying the orientation indicator indicating the current pose of the ultrasound device relative to the operator processing device, to: determine, based on the current pose of the ultrasound device relative to the operator processing device, two points in three-dimensional space along an axis of the ultrasound device; project the two points in three-dimensional space into two two-dimensional points in an operator video captured by the operator processing device (fig 5B; para 158, the last 12 lines; para 159, lines 1-10; para 186, lines 1-15; para 204); and display the orientation indicator at an angle relative to a horizontal axis of the display screen of the instructor processing device that is equivalent to an angle between a line formed by the two two-dimensional points and the horizontal axis of the display screen of the instructor processing device (figs 3B and 5B; para 186, the last 15 lines; para 189, lines 1-19; para 195, lines 1-11; para 209).
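The geometric operation recited in claim 8 (two 3-D points along the device axis, projected to two 2-D points, with the indicator drawn at the angle of the resulting line against the horizontal screen axis) admits a similarly compact sketch. It is illustrative only; the pinhole intrinsics matrix K and the 10 cm axis offset are assumptions, not features of the cited reference.
```python
# Minimal sketch (assumed pinhole camera model) of the claim 8 projection.
import numpy as np

def project_point(K: np.ndarray, p_cam: np.ndarray) -> np.ndarray:
    """Project a 3-D point in the camera frame to 2-D pixel coordinates."""
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]

def indicator_angle(K: np.ndarray, T_cam_device: np.ndarray) -> float:
    """Angle (radians) of the projected device axis relative to the
    horizontal axis of the display screen."""
    tail = (T_cam_device @ np.array([0.0, 0.0, 0.0, 1.0]))[:3]  # point on the axis
    tip = (T_cam_device @ np.array([0.0, 0.0, 0.1, 1.0]))[:3]   # 10 cm along the axis (assumed)
    u1, u2 = project_point(K, tail), project_point(K, tip)       # the two 2-D points
    d = u2 - u1
    return np.arctan2(d[1], d[0])  # angle of the 2-D line vs. the horizontal
```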
Regarding Claim 9, Rothberg et al. teach all the limitations of claim 7, and Rothberg et al. further teach wherein the orientation indicator (Fig 5B, 516) indicates a direction a marker (marker 510 in Fig. 5B) on the ultrasound device (502 in Fig. 5B) is pointing relative to the operator processing device (504 in Fig. 5B; par 0204, 'the instruction 516 may be overlaid onto the image 512 such that at least a portion of the instruction 516 is overlaid onto the ultrasound device 502 in the image 512. The computing device 504 may, for example, use the marker 510 to identify a pose (e.g., a position and/or orientation) of the ultrasound device 502 in the image 512 and position the instruction 516 in the augmented reality interface using the identified pose ... The computing device 504 may identify the pose of the ultrasound device 502 in any of a variety of ways. In some embodiments, the computing device may identify a position of the ultrasound device 502 in the image 512 by identifying a location of the marker 510').
Regarding Claim 10, Rothberg et al. teach all the limitations of claim 7, and Rothberg et al. further teach wherein the orientation indicator illustrates two-dimensionally a three-dimensional pose of a marker on the ultrasound device (Fig. 5B, par 0204, 'the instruction 516 may be overlaid onto the image 512 such that at least a portion of the instruction 516 is overlaid onto the ultrasound device 502 in the image 512. The computing device 504 may, for example, use the marker 510 to identify a pose (e.g., a position and/or orientation) of the ultrasound device 502 in the image 512 and position the instruction 516 in the augmented reality interface using the identified pose ... The computing device 504 may identify the pose of the ultrasound device 502 in any of a variety of ways. In some embodiments, the computing device may identify a position of the ultrasound device 502 in the image 512 by identifying a location of the marker 510').
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-5 and 21-25 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. PGPub 2017/0360412 to Rothberg et al. in view of U.S. PGPub 2017/0105701 to Pelissier et al.
Regarding Claim 1, Rothberg et al. teach an ultrasound system (ultrasound system 100 in Fig. 1; the system and its various elements are numbered differently in later Figures, but is generally compatible throughout the Figures as described in par 0341, par 0346), comprising: an instructor processing device configured to (par 0144, 'App'; par 0005, 'The App may provide real-time guidance to the operator regarding how to properly position the ultrasound device on the subject to capture a medically relevant ultrasound image. For example, the operator may place the ultrasound device on the subject and receive feedback from the App regarding how to move the ultrasound device on the subject. The feedback may be a sequence of instructions each including a particular direction to move the ultrasound device (e.g., up, down, left, right, rotate clockwise, or rotate counter-clockwise)'; par 0146, 'In another embodiment, the App may be executed on a cloud and communicated to the operator through the smart device. In yet another embodiment, the App may be executed on the ultrasound device itself and the instructions may be communicated to the user either through the ultrasound device itself or a smart device associated with the ultrasound device. Thus, it should be noted that the execution of the App may be at a local or a remote device without departing from the disclosed principles'; therefore, the App acts as the instructor and may be provided on a separate device that is not the operator processing device):
receive a current pose of an ultrasound device (Fig 1, ultrasound device 102) relative to an operator processing device (computing device 104 in Fig. 1; par 0164, 'a pose (e.g., position and/or orientation) of the ultrasound device in the captured image may be identified using an automated image processing technique (e.g., a deep learning technique)'; par 0236, 'the movement information from the sensors in both the ultrasound device and the computing device may be used in concert to identify the pose of the ultrasound device relative to the computing device and, thereby, identify the pose of the ultrasound device in the captured non-acoustic image') from the operator processing device (Fig. 1, 104; par 0146, 'In another embodiment, the App may be executed on a cloud and communicated to the operator through the smart device. In yet another embodiment, the App may be executed on the ultrasound device itself and the instructions may be communicated to the user either through the ultrasound device itself or a smart device associated with the ultrasound device. Thus, it should be noted that the execution of the App may be at a local or a remote device without departing from the disclosed principles'; therefore, the App acts as the instructor and may be provided on a separate device that is not the operator processing device; as the App provides the instructions shown in examples like Fig. 3B and 5B, the separate instruction device holding the App is capable of viewing those same instructions based upon the pose data sent from the computing device 104 to said App);
display, in an operator video (par 0310, 'The display screen 1508 may be configured to display images and/or videos') displayed on the instructor processing device (par 0144, 'App'; Fig. 1; par 0146, 'In another embodiment, the App may be executed on a cloud and communicated to the operator through the smart device. In yet another embodiment, the App may be executed on the ultrasound device itself and the instructions may be communicated to the user either through the ultrasound device itself or a smart device associated with the ultrasound device. Thus, it should be noted that the execution of the App may be at a local or a remote device without departing from the disclosed principles'; therefore, the App acts as the instructor and may be provided on a separate device that is not the operator processing device; as the App provides the instructions shown in examples like Fig. 3B and 5B, the separate instruction device holding the App is capable of viewing those same instructions based upon the pose data sent from the computing device 104 to said App), based on the current pose of the ultrasound device (Fig 1, 102) relative to the operator processing device (Fig 1, 104), an orientation indicator (such as symbol 314 in Fig. 3B or instructions 516 in Fig. 5B) indicating the current pose of the ultrasound device relative to the operator processing device (Fig 1, 104; Fig. 3B shows a direction symbol 314; Fig. 5B shows an instruction 516; par 0195, 'the fine instruction 312 includes a symbol 314 indicating which direction the operator should move the ultrasound device'; par 0058, “the at least one processor is configured to generate the augmented reality interface at least in part by overlaying an instruction to the operator of the ultrasound device onto the image using the pose of the ultrasound device …the at least one processor is configured to analyze the at least one characteristics of the marker in the image at least in part by identifying a color of the marker in the image. In some embodiments, the at least one processor is configured to identify the pose of the ultrasound device at least in part by identifying an orientation of the ultrasound device in the image using the color of the marker in the image”, par 0164-0165, “a pose (e.g., position and/or orientation) of the ultrasound device in the captured image may be identified using an automated image processing technique (e.g., a deep learning technique) and the information regarding the pose of the ultrasound device may be used to overlay an instruction onto at least part of the ultrasound device in the captured image. Example instructions that may be overlaid onto the image of the ultrasound device include symbols (such as arrows) indicating a direction in which the operator is to move the device …. the operator may gain a better appreciation for the particular region of the subject that is being imaged given the current position of the ultrasound device on the subject”, par 0203, 'an instruction 516 indicative of a direction for the operator to move the ultrasound device 502, a symbol indicating a location of the target anatomical plane, and/or an ultrasound image 514 captured by the ultrasound device 502 may be overlaid onto the image 512'; par 0206, “the instruction 516 may be overlaid onto the image 512 such that at least a portion of the instruction 516 is overlaid onto the ultrasound device 502 in the image 512. 
The computing device 504 may, for example, use the marker 510 to identify a pose (e.g., a position and/or orientation) of the ultrasound device 502 in the image 512 and position the instruction 516 in the augmented reality interface using the identified pose … The computing device 504 may identify the pose of the ultrasound device 502 in any of a variety of ways. In some embodiments, the computing device may identify a position of the ultrasound device 502 in the image 512 by identifying a location of the marker 510. The location of the marker 510 may be identified by searching for one or more distinct characteristics of the marker 510 in the image 512. Additionally (or alternatively), the computing device may identify an orientation of the ultrasound device 502 in the image 512 by analyzing one or more characteristics of the marker 512. For example, the marker 510 may be a dispersive marker and the computing device may identify an orientation of the ultrasound device 502 in the image 512 by identifying a color of the marker 510 in the image 512. In another example, the marker 510 may be a holographic marker and the computing device may identify an orientation of the ultrasound device 502 in the image 512 by identifying an image presented by the marker 510 in the image 512. In yet another example, the marker 510 may be a patterned monochrome marker and the computing device may identify an orientation of the ultrasound device 502 in the image 512 by identifying an orientation of the pattern on the marker 510 in the image 512”, par 0238, 'the movement information from the sensors in both the ultrasound device and the computing device may be used in concert to identify the pose of the ultrasound device relative to the computing device and, thereby, identify the pose of the ultrasound device in the captured non-acoustic image', also par 0243);
the orientation indicator comprises a ring (Fig 5B); the ring is centered in the operator video approximately at a tail of the ultrasound device and orientated approximately within a plane orthogonal to a longitudinal axis of the ultrasound device (Fig 5B);
display, on the instructor processing device, an instruction interface for selecting an instruction to alter the position of an ultrasound device based on the current pose of the ultrasound device (par 0180, “the computing device 104 may provide an instruction 108 using the display 106 to the operator regarding how to reposition the ultrasound device 102”, Fig 1,par 0185, “the computing device 104 may generate the ultrasound image 110 using the received ultrasound data and analyze the ultrasound image 110 using an automated image processing technique to generate the instruction 108 regarding how the operator should re-position the ultrasound device 102 to capture an ultrasound image containing the target anatomical view …..the computing device 104 may use a machine learning technique (such as a deep learning technique) to directly map the ultrasound image 110 to an output to provide to the user such as an indication of proper positioning or an instruction to reposition the ultrasound device 102 (e.g., instruction 108)”, Figs 3A-3B, par 0194-0195, “the fine instruction 312 includes a symbol 314 indicating which direction the operator should move the ultrasound device. The symbol 314 may be animated in some implementations. For example, the symbol 314 (e.g., an arrow and/or model of the ultrasound device) may move in a direction in which the ultrasound device is to be moved. The fine instruction 312 may also comprise a message 316 that compliments the symbol 314 such as the message “TURN CLOCKWISE.” The symbol 314 and/or the message 316 may be overlaid onto a background image 311. The background image 311 may be, for example, an ultrasound image generated using ultrasound data received from the ultrasound device”, Fig 5B, par 0204-0206,” the display 508 in the computing device 504 displays an augmented reality interface comprising a non-acoustic image 512 of the ultrasound device 502 being used on the subject 501 (e.g., captured by the imaging device 506) and one or more elements overlaid onto the image 512. For example, an instruction 516 indicative of a direction for the operator to move the ultrasound device 502, a symbol indicating a location of the target anatomical plane, and/or an ultrasound image 514 captured by the ultrasound device 502 may be overlaid onto the image 512” );
providing, to the operator processing device, the selected instruction (Fig 9, par 0231-0232, “the computing device may provide an instruction to reposition the ultrasound device to the operator. The instruction may be, for example, an audible instruction played through a speaker, a visual instruction displayed using a display, and/or a tactile instruction provided using a vibration device (e.g., integrated into the computing device and/or the ultrasound device). The instruction may be provided based on, for example, the sequence of instructions in the guidance plan generated in act 906. For example, the computing device may identify a single instruction from the sequence of instructions and provide the identified instruction”); and
display, in the operator video displayed on the instructor processing device, an instruction indicator based on the selected instruction (Figs 3A-3B, par 0194-0195, “FIG. 3A shows an example coarse instruction 302 that may be provided to an operator via a display 306 on a computing device 304. The coarse instruction 302 may be provided when the ultrasound device is positioned outside of a predetermined area on the subject. As shown, the coarse instruction 302 includes an indication of where the operator should position the ultrasound device on the subject to be within the predetermined area. In particular, the coarse instruction 302 comprises a symbol 308 (e.g., a star) showing where the predetermined region is located on a graphical image of the subject 301. The coarse instruction 302 also includes a message 310 with an arrow pointing to the symbol 308 instructing the operator to “POSITION ULTRASOUND DEVICE HERE” to communicate to the operator that the ultrasound device should be placed where the symbol 308 is located on the graphical image of the subject 301. ….the fine instruction 312 includes a symbol 314 indicating which direction the operator should move the ultrasound device. The symbol 314 may be animated in some implementations. For example, the symbol 314 (e.g., an arrow and/or model of the ultrasound device) may move in a direction in which the ultrasound device is to be moved. The fine instruction 312 may also comprise a message 316 that compliments the symbol 314 such as the message “TURN CLOCKWISE.” The symbol 314 and/or the message 316 may be overlaid onto a background image 311.”)
Rothberg et al., however, do not expressly teach that the orientation indicator comprises a ring and a ball; that the ring is centered in the operator video approximately at a tail of the ultrasound device and orientated approximately within a plane orthogonal to a longitudinal axis of the ultrasound device; that the ball is located in the operator video on the ring such that a line from the ring to the marker on the ultrasound device is approximately parallel to the longitudinal axis of the ultrasound device; or communicating, to the operator processing device, the selected instruction.
In a related endeavor, Pelissier et al. teach the orientation indicator comprises a ring and a ball; the ring is centered in the operator video approximately at a tail of the ultrasound device and orientated approximately within a plane orthogonal to a longitudinal axis of the ultrasound device; and the ball is located in the operator video on the ring such that a line from the ring to the marker on the ultrasound device is approximately parallel to the longitudinal axis of the ultrasound device (Fig. 10B, par 0107, “Probe 103B comprises an array of small lights 101-6 that are located where they can be seen by a user who is holding body 101-1 to perform an ultrasound scan. Selected lights 101-6 may be turned on to provide static or dynamically varying patterns selected to indicate motions corresponding to messages received from remote interface 118. For example, lights 101-6 on the left side of probe 103B may be controlled to blink to indicate motion to the left; lights 101-6 on the right side of probe 103B may be controlled to blink to indicate motion to the right; lights 101-6 on the front side of probe 103B may be controlled to blink to indicate forward tilt; lights 101-6 on the back side of probe 103B may be controlled to blink to indicate backward tilt; lights 101-6 may be controlled in a clockwise rotating pattern to indicate clockwise rotation; and lights 101-6 may be controlled in a counterclockwise rotating pattern to indicate counterclockwise rotation”), and communicating, to the operator processing device, the selected instruction (par 0105-0107, “Probe 103A comprises a small display 101-5 which is located where it can be seen by a user who is holding body 101-1 to perform an ultrasound scan. Display 101-5 may display predetermined static or moving images indicating motions corresponding to messages received from remote interface 118 ….Selected lights 101-6 may be turned on to provide static or dynamically varying patterns selected to indicate motions corresponding to messages received from remote interface 118”; i.e., instructions are provided to the scanning device from a remote device or computing device).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Rothberg et al. such that the orientation indicator comprises a ring and a ball; the ring is centered in the operator video approximately at a tail of the ultrasound device and orientated approximately within a plane orthogonal to a longitudinal axis of the ultrasound device; and the ball is located in the operator video on the ring such that a line from the ring to the marker on the ultrasound device is approximately parallel to the longitudinal axis of the ultrasound device, and to communicate, to the operator processing device, the selected instruction, as taught by Pelissier et al., in order to allow a remote operator to view a real-time stream of ultrasound images and provide visual feedback that helps an inexperienced operator modify and improve his or her ultrasound scanning technique.
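For clarity, the ring-and-ball geometry recited in claim 1 (ring at the device tail in a plane orthogonal to the longitudinal axis; ball on the ring such that the line from the ring to the marker is approximately parallel to that axis) can be sketched as below. The marker offset, ring radius, and frame conventions are assumptions; neither cited reference discloses code of this form.
```python
# Minimal sketch (hypothetical conventions) of the ring-and-ball indicator.
import numpy as np

def ring_and_ball(T_cam_device: np.ndarray,
                  marker_offset: np.ndarray,
                  ring_radius: float = 0.03):
    """Ring centered at the device tail, orthogonal to the longitudinal axis;
    ball placed on the ring so the ball-to-marker line parallels that axis."""
    R, t = T_cam_device[:3, :3], T_cam_device[:3, 3]
    axis = R @ np.array([0.0, 0.0, 1.0])   # assumed longitudinal (z) axis
    ring_center = t                        # tail assumed at the device-frame origin
    marker = t + R @ marker_offset         # marker position in the camera frame
    v = marker - ring_center
    v = v - np.dot(v, axis) * axis         # component of the offset in the ring plane
    ball = ring_center + ring_radius * v / np.linalg.norm(v)
    return ring_center, axis, ball
```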
Regarding Claim 2, Rothberg et al. as modified by Pelissier et al. teach all the limitations of claim 1, and Rothberg et al. further teach wherein the operator video depicts the ultrasound device (Fig 1, par 0181, par 0102, Fig. 5B shows a video with instruction 516 where the ultrasound device is clearly depicted in the video; par 0203, 'the display 508 in the computing device 504 displays an augmented reality interface comprising a non-acoustic image 512 of the ultrasound device 502 being used on the subject 501 (e.g., captured by the imaging device 506) and one or more elements overlaid onto the image 512'; par 0310, 'The display screen 1508 may be configured to display images and/or videos').
Regarding Claim 3, Rothberg et al. as modified by Pelissier et al. teach all the limitations of claim 1, and Rothberg et al. further teach wherein the orientation indicator is displayed in the operator video such that the orientation indicator appears to be a part of a real-world environment in the operator video (Fig. 5B, par 0162, 'Accordingly, certain disclosed embodiments relate to new techniques for providing instructions to an operator of an ultrasound device through an augmented reality interface. In the augmented reality interface, the instructions may be overlaid onto a view of the operator's real-world environment. For example, the augmented reality interface may include a view of the ultrasound device positioned on the subject and an arrow indicative of the particular direction that the ultrasound device should be moved.'; par 0164, “the method may further include generating a composite image at least in part by overlaying, onto the image of the ultrasound device, at least one instruction indicating how the operator is to reposition the ultrasound device. For example, a pose (e.g., position and/or orientation) of the ultrasound device in the captured image may be identified using an automated image processing technique (e.g., a deep learning technique) and the information regarding the pose of the ultrasound device may be used to overlay an instruction onto at least part of the ultrasound device in the captured image”, par 0203, 'the display 508 in the computing device 504 displays an augmented reality interface comprising a non-acoustic image 512 of the ultrasound device 502 being used on the subject 501 (e.g., captured by the imaging device 506) and one or more elements overlaid onto the image 512').
Regarding Claim 4, Rothberg et al. as modified by Pelissier et al. teach all the limitations of claim 1, and Rothberg et al. further teach wherein the instructor processing device is configured, when displaying the orientation indicator indicating the current pose of the ultrasound device relative to the operator processing device, to: determine a default position and orientation of the orientation indicator in three-dimensional space for a default pose of the ultrasound device relative to the operator processing device (para 158, the last 12 lines; para 186, lines 1-15; para 189, lines 1-5, fig 5B, direction indicator 516; the circular directional arrow is centered on the ultrasound device near the base of the device and is orthogonal to the ultrasound device); position and/or orient the orientation indicator in three-dimensional space from the default position and orientation based on a difference between the current pose and the default pose of the ultrasound device relative to the operator processing device; and project the orientation indicator from its three-dimensional position and orientation into two-dimensional space for display in the operator video displayed on the instructor processing device (Figs 3B and 5B; para 151, the last 10 lines; para 186, the last 10 lines; para 190).
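The claim 4 sequence (a default indicator pose, repositioning by the difference between the current and default device poses, then projection into the operator video) reduces to transform composition followed by projection. A minimal sketch follows, with an assumed intrinsics matrix K and hypothetical names; it is not taken from either cited reference.
```python
# Minimal sketch (assumed conventions) of the claim 4 default-pose pipeline.
import numpy as np

def place_orientation_indicator(T_current: np.ndarray,
                                T_default: np.ndarray,
                                indicator_pts_default: np.ndarray,
                                K: np.ndarray) -> np.ndarray:
    """Move indicator geometry from its default pose by the pose difference,
    then project the 3-D points into 2-D for display in the operator video."""
    delta = T_current @ np.linalg.inv(T_default)          # current vs. default pose
    pts_h = np.hstack([indicator_pts_default,
                       np.ones((len(indicator_pts_default), 1))])
    pts_3d = (delta @ pts_h.T).T[:, :3]                   # repositioned indicator points
    uvw = (K @ pts_3d.T).T
    return uvw[:, :2] / uvw[:, 2:3]                       # 2-D pixel coordinates
```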
Regarding Claim 5, Rothberg et al. as modified by Pelissier et al. teach all the limitations of claim 1, and Rothberg et al. further teach wherein the orientation indicator (Fig 5B, 516) indicates an orientation of the marker (marker 510 in Fig. 5B) on the ultrasound device (502 in Fig. 5B) relative to the operator processing device (504 in Fig. 5B; par 0204, 'the instruction 516 may be overlaid onto the image 512 such that at least a portion of the instruction 516 is overlaid onto the ultrasound device 502 in the image 512. The computing device 504 may, for example, use the marker 510 to identify a pose (e.g., a position and/or orientation) of the ultrasound device 502 in the image 512 and position the instruction 516 in the augmented reality interface using the identified pose ... The computing device 504 may identify the pose of the ultrasound device 502 in any of a variety of ways. In some embodiments, the computing device may identify a position of the ultrasound device 502 in the image 512 by identifying a location of the marker 510').
Regarding Claim 21, Rothberg et al. as modified by Pelissier et al. teach all the limitations of claim 1, and Pelissier et al. further teach wherein the instruction interface includes an option to select a type of instruction (Figs 4 and 5A, par 0080-0081, “These discrete messages may be selected from a set of predetermined discrete messages. The predetermined messages can include messages to change the position of probe 103. In the illustrated embodiment, remote interface 118 includes a position feedback selector 120 configured to allow a remote expert to trigger remote interface 118 to send selected discrete messages to apparatus 100A”). This would be obvious for the same reason given in the rejection for claim 1.
Regarding Claim 22, Rothberg et al. as modified by Pelissier et al. teach all the limitations of claim 21, and Pelissier et al. further teach wherein the type of instruction may be selected from a group consisting of, a rotate option, a tilt option, a move option, a draw option, and a text option (Figs 4 and 5A-5D, par 0080-0089, “In a preferred embodiment the discrete messages include messages signifying: [0081] move probe 103 to the left; [0082] move probe 103 to the right; [0083] move probe 103 toward the front; [0084] move probe 103 toward the back; [0085] rotate probe 103 clockwise; [0086] rotate probe 103 counterclockwise; [0087] tilt probe 103 forward; [0088] tilt probe 103 backward; [0089] rock probe 103 to the right; [0090] rock probe 103 to the left”, par 0115-0116, “text messages may be displayed on a display of apparatus 101A, for example on screen 102A. The text messages may include predetermined statements such as ‘scan left’; ‘scan right’; ‘tilt forward’ etc. The text messages may be delivered in a language corresponding to a language setting for local user interface 102; [0116] tactile feedback may be provided. In some embodiments the tactile feedback is provided by transducers such as piezoelectric transducers and/or heaters on a housing of probe 103 that may be selectively actuated to change the texture, temperature and/or vibration patterns sensed by a user holding probe 103 in a manner that maps intuitively to suggested movements”). This would be obvious for the same reason given in the rejection for claim 1.
Regarding Claim 23, Rothberg et al. as modified by Pelissier et al. teach all the limitations of claim 21, and Pelissier et al. further teach wherein the instruction interface is configured to display an alternate interface in response to the type of instruction selected (Figs 3-4, par 0077-0080, “The display includes ultrasound image 402 and video stream 404 as well as controls that may be used by an expert to provide direction and feedback to a user. FIG. 4 includes a plurality of graphical feedback elements 121 for selection by an expert. These feedback elements may be actuated by the expert to generate predetermined messages”). This would be obvious for the same reason given in the rejection for claim 1.
Regarding Claim 24, Rothberg et al. as modified by Pelissier et al. teach all the limitations of claim 23, and Pelissier et al. further teach wherein the alternate interface may be selected from a group consisting of, a rotation interface, a tilt interface, and a translation interface (Figs 4 and 5A-5D, par 0080-0089, “In a preferred embodiment the discrete messages include messages signifying: [0081] move probe 103 to the left; [0082] move probe 103 to the right; [0083] move probe 103 toward the front; [0084] move probe 103 toward the back; [0085] rotate probe 103 clockwise; [0086] rotate probe 103 counterclockwise; [0087] tilt probe 103 forward; [0088] tilt probe 103 backward; [0089] rock probe 103 to the right; [0090] rock probe 103 to the left”). This would be obvious for the same reason given in the rejection for claim 1.
Regarding Claim 25, Rothberg et al. as modified by Pelissier et al. teach all the limitations of claim 23, and Pelissier et al. further teach wherein the alternate interface may display the orientation indicator within the alternate interface (Figs 4 and 5A-5D, par 0080-0100, “In a preferred embodiment the discrete messages include messages signifying: [0081] move probe 103 to the left; [0082] move probe 103 to the right; [0083] move probe 103 toward the front; [0084] move probe 103 toward the back; [0085] rotate probe 103 clockwise; [0086] rotate probe 103 counterclockwise; [0087] tilt probe 103 forward; [0088] tilt probe 103 backward; [0089] rock probe 103 to the right; [0090] rock probe 103 to the left. FIG. 5A shows an example feedback selector 120. In the illustrated embodiment, feedback selector 120 has the form of a palette comprising several individual controls 121. Controls 121-SL, 121-SR, 121-SF, and 121-SB respectively trigger messages for moving (sliding) probe 103 to the left, to the right forward and backward; controls 121-CW and 121-CCW respectively trigger messages for rotating probe 103 clockwise and counterclockwise; controls 121-TF and 121-TB respectively trigger messages for tilting probe 103 forward and backward; and controls 121-RR and 121-RL respectively trigger messages for rocking probe 103 to the right and to the left”). This would be obvious for the same reason given in the rejection for claim 1.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Jin Ge whose telephone number is (571)272-5556. The examiner can normally be reached 8:00 to 5:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jason Chan can be reached at (571)272-3022. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
JIN GE
Examiner
Art Unit 2619
/JIN GE/Primary Examiner, Art Unit 2619