Prosecution Insights
Last updated: April 19, 2026
Application No. 18/655,190

REAL-TIME REMOTE GUIDANCE

Non-Final OA: §102, §103, §112
Filed: May 03, 2024
Examiner: MCDONALD, JAMES F
Art Unit: 3797
Tech Center: 3700 (Mechanical Engineering & Manufacturing)
Assignee: Exo Imaging Inc.
OA Round: 1 (Non-Final)
Grant Probability: 55% (Moderate)
Predicted OA Rounds: 1-2
Estimated Time to Grant: 3y 6m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 55% (grants 55% of resolved cases; 42 granted / 76 resolved; -14.7% vs TC avg)
Interview Lift: +44.3% (strong; allowance rate of resolved cases with vs. without interview)
Avg Prosecution: 3y 6m (typical timeline; 33 applications currently pending)
Total Applications: 109 (career history, across all art units)
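As a sanity check, the headline examiner metrics above reduce to simple ratios. A minimal sketch in Python; note the 70.0% Tech Center average is an assumed figure implied by the -14.7% delta shown above, not a number reported directly:

```python
# Sketch: reproduce the examiner stat-card arithmetic from the raw counts.
# Counts (42 granted / 76 resolved) come from the dashboard above; the
# 70.0% Tech Center average is assumed, backed out from the -14.7% delta.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allowance rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

career = allow_rate(42, 76)      # ~55.3%, displayed above as 55%
tc_average = 70.0                # assumed baseline (see note above)
delta = career - tc_average      # ~-14.7 percentage points

print(f"Career allow rate: {career:.1f}% ({delta:+.1f}% vs TC avg)")
```

The same ratio convention (granted over resolved, excluding pending cases) is what makes the 33 currently pending applications drop out of the 42/76 figure.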

Statute-Specific Performance

§101: 5.1% (-34.9% vs TC avg)
§103: 41.5% (+1.5% vs TC avg)
§102: 19.4% (-20.6% vs TC avg)
§112: 32.1% (-7.9% vs TC avg)
Tech Center averages are estimates; figures based on career data from 76 resolved cases.
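Each per-statute delta can be inverted to recover the implied Tech Center baseline. A small sketch; the recovered 40.0% figure is derived from the pairs above, not stated anywhere in the source:

```python
# Sketch: back out the implied Tech Center average from each statute's
# career rate and its reported delta. Notably, all four pairs imply the
# same ~40% baseline, consistent with a single average line in the chart.

rates = {            # statute: (examiner rate %, delta vs TC avg %)
    "§101": (5.1, -34.9),
    "§102": (19.4, -20.6),
    "§103": (41.5, +1.5),
    "§112": (32.1, -7.9),
}

implied_tc_avg = {s: round(rate - delta, 1) for s, (rate, delta) in rates.items()}
print(implied_tc_avg)   # every statute maps to the same 40.0 baseline
```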

Office Action

Rejections: §102, §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Election/Restrictions

Applicant’s election of inventive Group I (claims 1-14) and Species C (claim 9), without traverse, in the reply filed on 11/20/2025 is respectfully acknowledged. Upon review, the prior species requirement has been withdrawn. Accordingly, claims 1-14 remain pending for examination on the merits.

Claim Objections

Claim(s) 1 and 10 are objected to for inconsistent language. Claim 1 recites the limitation “during a medical procedure” in the preamble, but in subsequent limitations recites “representative of a procedure being performed by the local user, to the skilled resource, wherein the local user performs the procedure”. The claim language must be amended by replacing the phrase “a procedure” with --the medical procedure-- for consistency throughout the claim. Similarly, claim 10 recites “a medical procedure” in the preamble and the subsequent clause. The limitation “performing a medical procedure using the medical device” must be amended to --performing the medical procedure using the medical device--. Appropriate correction is required.

Claim(s) 14 recite(s) the limitation “such that […]”. It is suggested to replace the phrase “such that” with the term --wherein-- to ensure the positive recitation of all elements in the claim. The use of the phrase “such that” may be interpreted as a negative limitation in the claim, resulting in an interpretation of subsequent limitations (i.e., “such that when the medical device operator has oriented the medical device in the same way as the remote source, the medical device operator receives haptic feedback in their medical device” in claim 14) as preferred or suggested limitations, which therefore may be excluded from examination.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C.
112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claim(s) 1-9 and 10-14 is/are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. Claims 2-9 and 11-14 are also rejected at least by virtue of dependency upon a rejected claim.

Claim 1 recites the limitations “permitting the skilled resource to transmit feedback to the local user; and conveying the feedback to the local user via a communication interface”, which renders the claim indefinite. The use of the term ‘permitting’ is broad, and within the context of the claim language ‘permitting the skilled resource to transmit feedback to the local user’ may be interpreted to be functionally analogous to the subsequent limitation of ‘conveying the feedback to the local user’, or to ‘approve’ the transmission without actually performing the transmission of feedback. The distinction between the ‘permitting’ and the ‘conveying’ clauses is unclear, as both appear to result in the transmission of ‘feedback’ to the local user. It is suggested to amend the claim to define the ‘permitting’ function and to describe its relationship/nature with regard to the ‘feedback’. For the purposes of examination, the broadest reasonable interpretation of the ‘permitting’ and ‘conveying’ language may be the transmission of any data/information stream to the ‘local user’.
Claim 10 recites the limitations “the shared application permitting sharing and viewing of the video; and permitting the remote source control of an operation or attribute of the medical device via the shared application”, which renders the claim indefinite for similar reasons as discussed regarding claim 1 above. The use of the term “permitting” is broad, and it is unclear if the term refers to the transmission of information or to an active step of approving or acknowledging the subsequent functions of “sharing and viewing of the video” and “the remote source control of an operation or attribute” (e.g., transmission of information data, approving the transmission of video and/or approving the remote source control, etc.). It is suggested to amend the claim to define the ‘permitting’ function and to clearly describe its relationship/nature with regard to the ‘sharing’ and ‘remote source control’ functions. For the purposes of examination, the broadest reasonable interpretation of the ‘permitting’ language may be the transmission or approval of ‘video’ or ‘remote source control’.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claim(s) 1-8 is/are rejected under 35 U.S.C. 102(a)(1) as being clearly anticipated by Rothberg et al.
(US20170360401A1, 2017-12-21; hereinafter “Rothberg”).

Regarding claim 1, Rothberg teaches a method for providing remote guidance from a skilled resource to a local user during a medical procedure on a patient for enhancing quality of the medical procedure performed by the local user (“A method, comprising: […] generating, using the ultrasound image, a guidance plan for how to guide an operator of the ultrasound device to capture an ultrasound image of the subject containing the target anatomical view” [clm 20]; “techniques for guiding an operator to use an ultrasound device” [abst]; “The disclosure provides techniques for instructing an operator of an ultrasound device how to position the ultrasound device on a subject to capture a medically relevant ultrasound image. […] The guidance may be provided via a software application (hereinafter “App”) installed on a computing device of the operator (such as: a mobile device, a smartphone or smart-device, tablet, etc.)” [0144]; “the App may be executed on a cloud and communicated to the operator through the smart device. […] the execution of the App may be at a local or a remote device” [0146]; [0140-0180], [fig. 1, 5A-5B, 15A-15B, 9]), the method comprising:

transmitting information, representative of a procedure being performed by the local user, to the skilled resource, wherein the local user performs the procedure using a medical device (“the ultrasound device may generate ultrasound sound data and transmit (via a wired or wireless communication link) the ultrasound data to the computing device” [0148]; “The ultrasound device 102 may transmit ultrasound data to the computing device 104 using the communication link 112. The communication link 112 may be a wired (or wireless) communication link.” [0183]; “the ultrasound device may be initially positioned on a subject 201 at an initial position 202 (on a lower torso of the subject 201)” [0186]; “The computing device 1502 may be connected to the network 1516 over a wired connection (e.g., via an Ethernet cable) and/or a wireless connection (e.g., over a WiFi network). […] the computing device 1502 may send an ultrasound image over the network 1516 to the server 1518 for analysis (e.g., to identify an anatomical feature in the ultrasound image and/or identify an instruction to provide the operator) and receive the results of the analysis from the server 1518.” [0312]; The operator positions the ultrasound device at an initial position to begin an ultrasound examination, wherein ultrasound data and the location of the ultrasound device (i.e., information) are acquired and transmitted over a network to external remote device(s) (e.g., server, workstation, etc.) via the computing device [0181-0232, 0300-0315], [fig. 1, 5A-5B, 9, 15A-15B; see fig. 15B reproduced below]);

permitting the skilled resource to transmit feedback to the local user (“determining whether the ultrasound image contains a target anatomical view; […] generating, using the ultrasound image, a guidance plan for how to guide an operator of the ultrasound device to capture an ultrasound image of the subject containing the target anatomical view;” [clm 20]; “The guidance may be provided via a software application (hereinafter “App”) […] The operator may then position the ultrasound device on the subject and the software application (via the computing device) may provide feedback to the operator indicating whether the operator should reposition the ultrasound device and how he/she should proceed to do so.” [0144]; “the App may be executed on a cloud and communicated to the operator through the smart device. […] the execution of the App may be at a local or a remote device” [0146]; The App (i.e., skilled resource) is a software application which may be executed at an external device (e.g., server), wherein the App receives ultrasound image information to determine a guidance plan for transmission to the operator to acquire the targeted anatomical view [0181-0232, 0300-0315], [fig. 1, 5A-5B, 9, 15A-15B; see fig. 9 reproduced below]); and

[Figure: Flowchart describing the method of remotely guiding an operator through an ultrasound examination (Rothberg, fig. 9)]

conveying the feedback to the local user via a communication interface (“generating, using the ultrasound image, a guidance plan for how to guide an operator of the ultrasound device to capture an ultrasound image of the subject containing the target anatomical view; and providing at least one instruction to the operator based on the generated guidance plan.” [clm 20]; “The computing device 1502 may be connected to the network 1516 […] the computing device 1502 may send an ultrasound image over the network 1516 to the server 1518 for analysis (e.g., to identify an anatomical feature in the ultrasound image and/or identify an instruction to provide the operator) and receive the results of the analysis from the server 1518.” [0312]; The App executing on an external device may receive and perform analysis on ultrasound data to determine operator instructions, wherein the computing device receives the instructions via a network connection (i.e., communication interface) and provides the instructions to the operator during the ultrasound imaging examination [0181-0232, 0300-0315], [fig. 1, 5A-5B, 9, 15A-15B; see fig. 15B reproduced below]).

[Figure: Ultrasound device 1514 acquires and transmits ultrasound data to external device(s) using a computing device with a network connection (Rothberg, fig. 15B)]

Regarding claim 2, Rothberg teaches the method of Claim 1, Rothberg further teaching wherein the permitting the skilled resource to transmit feedback to the local user comprises permitting the skilled resource to indicate at least one suggested position or movement of the medical device (“wherein generating the guidance plan comprises identifying, using the identified anatomical view, a direction in which to move the ultrasound device, and wherein providing the at least one instruction to the operator comprises providing an instruction to the operator to move the ultrasound device in the identified direction” [clm 22]; “wherein identifying the direction in which to move the ultrasound device comprises identifying a translational direction or a rotational direction in which to move the ultrasound device” [clm 23]; [0181-0232, 0300-0315], [fig. 1, 3A-3B, 5A-5B, 9, 15A-15B; see fig. 3B, 5B reproduced below], [see claim 1 rejection]).

[Figure: Examples of movement instructions provided to the operator to reposition and orient the ultrasound device (Rothberg, figs. 3A, 5B)]

Regarding claim 3, Rothberg teaches the method of Claim 1, Rothberg further teaching wherein the transmitting information comprises sending a video feed to the skilled resource of the local user performing the medical procedure (“the method further comprises displaying a plurality of composite images in real time.
[…] the method further comprises providing instructions in real time based on the plurality of composite images, wherein the instructions guide a user of the ultrasound probe in acquisition of subsequent ultrasound images of the portion of the patient's body” [0110]; “The computing device 504 may use the tracked location of the ultrasound device in the non-acoustic images to overlay one or more elements (e.g., instructions) onto the non-acoustic images to form an augmented reality interface” [0201]; “Additionally (or alternatively), the processor 1510 may control the imaging device 1506 to capture non-acoustic images of the ultrasound device 1514 being used on a subject to provide an operator of the ultrasound device 1514 an augmented reality interface” [0310]; A plurality of non-acoustic images (i.e., a video feed) are taken of the operator performing ultrasound imaging to generate composite images and to inform the instructions that guide the operator [0181-0232, 0300-0315], [fig. 1, 3A-3B, 5A-5B, 9-11, 15A-15B], [see claim 2 rejection]).

Regarding claim 4, Rothberg teaches the method of Claim 3, Rothberg further teaching wherein the video feed is a live video feed (“the method further comprises displaying a plurality of composite images in real time. In some embodiments, the composite images are displayed on an augmented reality display.” [0110]; “The marker advantageously allow the computing device 504 to more easily track the location of the ultrasound device in non-acoustic images captured by an imaging device 506 (e.g., integrated into the computing device 504)” [0201]; “The ultrasound data may be processed in real-time during a scanning session as the echo signals are received.” [0308]; A plurality of non-acoustic images are acquired and processed in real time (i.e., a live video feed) simultaneously with ultrasound images, which are used to generate a plurality of composite images which provide instructions that guide a user of the ultrasound probe in acquisition of subsequent ultrasound images of the portion of the patient's body [0181-0232, 0300-0315], [fig. 1, 3A-3B, 5A-5B, 9-11, 15A-15B], [see claim 3 rejection]).

Regarding claim 5, Rothberg teaches the method of Claim 1, Rothberg further teaching wherein the permitting the skilled resource to transmit feedback comprises the skilled resource overlaying notation over the information transmitted to the skilled resource (“generating a composite image at least in part by overlaying, onto the image of the ultrasound device, at least one instruction indicating how the operator is to reposition the ultrasound device; and presenting the composite image to the operator.” [0035]; “The one or more elements overlaid onto the non-acoustic image may be, for example, one or more instructions designed to provide feedback to the operator regarding how to reposition the ultrasound device to obtain an ultrasound image that contains a target anatomical view.” [0235]; [fig. 1, 3A-3B, 5A-5B, 9-11, 15A-15B; see fig. 5B reproduced below], [see claim 1, 2 rejections]).
[Figure: Movement instructions provided to the operator to reposition and orient the ultrasound device may be overlaid on non-acoustic imaging of the ultrasound device during an examination (Rothberg, fig. 5B)]

Regarding claim 6, Rothberg teaches the method of Claim 1, Rothberg further teaching wherein the conveying the feedback comprises providing visual feedback to the local user using a first communication device (“the computing device may provide an instruction to reposition the ultrasound device to the operator. The instruction may be, for example, an audible instruction played through a speaker, a visual instruction displayed using a display, and/or a tactile instruction provided using a vibration device (e.g., integrated into the computing device and/or the ultrasound device).” [0230]; The computing device (i.e., first communication device) comprises a display screen for presenting instructions to the operator [0181-0232, 0300-0315], [fig. 1, 3A-3B, 5A-5B, 9-11, 15A-15B; see fig. 15B reproduced below], [see claim 1 rejection]).

[Figure: The computing device comprises a vibration device, audio output device, and display screen to provide instructions to the operator during an ultrasound examination (Rothberg, fig. 15B)]

Regarding claim 7, Rothberg teaches the method of Claim 1, Rothberg further teaching wherein the skilled resource comprises a skilled user of the medical device (“The computing device may be a mobile smartphone, a tablet, a laptop, a workstation, or any other suitable computing device.” [0147]; “The diagnostic application may be configured to assist the subject to operate the ultrasound device and store (and/or upload) the captured ultrasound images for analysis by the physician. Thereby, the physician may be able to remotely monitor a condition of the subject without making the subject remain in inpatient care.” [0218]; “the computing device 1502 may communicate with one or more workstations 1520, servers 1518, and/or databases 1522.” [0306]; The diagnostic application App may be designed for use by a health care professional (i.e., skilled user), which is executed by a computing device (e.g., a workstation) accessed remotely over the network [0181-0232, 0300-0315], [fig. 1, 3A-3B, 5A-5B, 9, 15A-15B]).

Regarding claim 8, Rothberg teaches the method of Claim 1, Rothberg further teaching wherein the skilled resource comprises an artificial intelligence (“Example automated image processing techniques include machine learning techniques such as deep learning techniques. […] the convolutional neural network may be trained with a set of ultrasound images labeled with the particular anatomical view depicted in the ultrasound image” [0149]; “the computing device may identify the initial position of the ultrasound device on the subject at least in part by: identifying an anatomical view contained in the ultrasound image (e.g., using deep learning techniques) and map the identified anatomical view to a position on the subject” [0229]; “Aspects of the technology described herein relate to the application of automated image processing techniques to analyze images, such as ultrasound images and non-acoustic images. In some embodiments, the automated image processing techniques may comprise machine learning techniques such as deep learning techniques.” [0257]; Deep learning techniques (i.e., artificial intelligence) may be applied to analyze both ultrasound image data and non-acoustic images, and to determine operator instructions [0144-0153, 0171-0174, 0225-0244, 0257-0315], [fig. 1, 3A-3B, 5A-5B, 9-11, 15A-15B]).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claim(s) 9 and 10-14 is/are rejected under 35 U.S.C. 103 as being obvious over Rothberg (regarding claim 9, as applied to claim 1 above), in view of Pelissier et al. (US20170105701A1, 2017-04-20; hereinafter “Pelissier”) as provided by Applicant.
Regarding claim 9, Rothberg teaches the method of Claim 1, but Rothberg may fail to explicitly teach permitting control of at least one attribute of the medical device by the skilled resource.

However, in the same field of endeavor, Pelissier teaches a method for providing remote guidance from a skilled resource to a local user during a medical procedure on a patient for enhancing quality of the medical procedure performed by the local user (“A method for providing positional feedback to an operator performing an ultrasound scan of a patient” [clm 25]; “System 100 comprises apparatus 100A at a location where a user performs an ultrasound scan on a patient and apparatus 100B at a remote location. An expert may use apparatus 100B to review information about an ultrasound procedure in real time as the ultrasound procedure is being performed at apparatus 100A and to provide real time feedback to a user of apparatus 100A.” [0061]; [0061-0090, 0147-0168], [fig. 1-2, 7-8, 11]); Pelissier further teaching wherein the permitting the skilled resource to transmit feedback comprises permitting control of at least one attribute of the medical device by the skilled resource (“When remote feedback is desired apparatus 100A is caused to transmit one or both of the ultrasound imaging data and the video stream through a data communication network 106 to remote apparatus 100B.” [0070]; “patient imaging device 104 is adjustable (for example has adjustable pan and/or zoom) and remote interface device 118 includes controls that allow the expert to remotely adjust patient imaging device 104 to obtain a clear view of the procedure or a view that can best help the user” [0074]; “the remote expert may provide feedback to the user by way of remote user interface 118.” [0076]; The remote expert (i.e., skilled resource) may monitor the ultrasound scan and transmit feedback to the user (e.g., adjust the patient imaging device, demonstrate a particular technique, send discrete messages to a display on the local user’s device, etc.) to instruct the user to improve the ultrasound scan [0061-0090, 0147-0168], [fig. 1-2, 7-8, 11]).

It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the invention to combine the method for providing remote guidance taught by Rothberg by permitting control of at least one attribute of the medical device by the skilled resource as taught by Pelissier. It can be difficult to properly capture and analyse ultrasound images. The successful use of ultrasound is dependent on highly-skilled technicians to perform the scans and experienced physicians to interpret them. It typically takes several years of training for a technician to become proficient (Pelissier [0003]). Trained ultrasound technicians are not universally accessible. Ultrasound could be used to improve care for more patients, especially in rural and low-resource settings, if less-skilled operators were able to perform ultrasound examinations quickly and accurately (Pelissier [0007]). Furthermore, holding the ultrasound device a few inches too high or too low on the subject may make the difference between capturing a medically relevant ultrasound image and capturing a medically irrelevant ultrasound image. As a result, non-expert operators of an ultrasound device may have considerable trouble capturing medically relevant ultrasound images of a subject (Rothberg [0004]). The technology improvements described may enable, among other capabilities, focused diagnosis, early detection and treatment of conditions by an ultrasound system (Rothberg [0209]), and may increase the effectiveness of remote feedback, thus helping inexperienced ultrasound users to capture accurate images in less time, ultimately leading to improved patient care and reduced costs (Pelissier [0032]).
Regarding claim 10, Rothberg teaches a method for enhancing a device operator's use of a medical device via feedback from a remote source during a medical procedure (“A method, comprising: […] generating, using the ultrasound image, a guidance plan for how to guide an operator of the ultrasound device to capture an ultrasound image of the subject containing the target anatomical view” [clm 20]; “techniques for guiding an operator to use an ultrasound device” [abst]; “The disclosure provides techniques for instructing an operator of an ultrasound device how to position the ultrasound device on a subject to capture a medically relevant ultrasound image. […] The guidance may be provided via a software application (hereinafter “App”) installed on a computing device of the operator (such as: a mobile device, a smartphone or smart-device, tablet, etc.)” [0144]; [0140-0180], [fig. 1, 5A-5B, 15A-15B, 9]), the method comprising:

performing a medical procedure using the medical device (“the ultrasound device may generate ultrasound sound data and transmit (via a wired or wireless communication link) the ultrasound data to the computing device” [0148]; “The ultrasound device 102 may transmit ultrasound data to the computing device 104 using the communication link 112. The communication link 112 may be a wired (or wireless) communication link.” [0183]; “the ultrasound device may be initially positioned on a subject 201 at an initial position 202 (on a lower torso of the subject 201)” [0186]; [0181-0232], [fig. 1, 5A-5B, 9, 15A-15B], [see claim 1 rejection]);

transmitting a video, via a shared application, of the medical procedure from a video-enabled device of the device operator to a remote device of the remote source, the shared application permitting sharing and viewing of the video (“the method further comprises displaying a plurality of composite images in real time. […] the method further comprises providing instructions in real time based on the plurality of composite images, wherein the instructions guide a user of the ultrasound probe in acquisition of subsequent ultrasound images of the portion of the patient's body” [0110]; “The guidance may be provided via a software application (hereinafter “App”) […] The operator may then position the ultrasound device on the subject and the software application (via the computing device) may provide feedback to the operator indicating whether the operator should reposition the ultrasound device and how he/she should proceed to do so.” [0144]; “the App may be executed on a cloud and communicated to the operator through the smart device. […] the execution of the App may be at a local or a remote device” [0146]; “Additionally (or alternatively), the processor 1510 may control the imaging device 1506 to capture non-acoustic images of the ultrasound device 1514 being used on a subject to provide an operator of the ultrasound device 1514 an augmented reality interface” [0310]; [0181-0232, 0300-0315], [fig. 1, 3A-3B, 5A-5B, 9-11, 15A-15B], [see claim 1, 3 rejections]);

but Rothberg may fail to explicitly teach the remote source control of an operation or attribute of the medical device via the shared application.

However, in the same field of endeavor, Pelissier teaches a method for enhancing a device operator's use of a medical device via feedback from a remote source during a medical procedure (“A method for providing positional feedback to an operator performing an ultrasound scan of a patient” [clm 25]; [0061-0090, 0147-0168], [fig. 1-2, 7-8, 11], [see claim 9 rejection]), the method comprising:

performing a medical procedure using the medical device (“acquiring ultrasound image data from an probe at a first location;” [clm 25]; “System 100 comprises apparatus 100A at a location where a user performs an ultrasound scan on a patient and apparatus 100B at a remote location. An expert may use apparatus 100B to review information about an ultrasound procedure in real time as the ultrasound procedure is being performed at apparatus 100A and to provide real time feedback to a user of apparatus 100A.” [0061]; “In operation 201, ultrasound imaging data is acquired.” [0160]; [0061-0090, 0147-0168], [fig. 1-2, 7-8, 11], [see claim 9 rejection]);

transmitting a video, via a shared application, of the medical procedure from a video-enabled device of the device operator to a remote device of the remote source, the shared application permitting sharing and viewing of the video (“acquiring a video stream depicting the probe and the patient; transmitting the ultrasound image data and the video stream to a second location that is remote from the first location; displaying on a graphical interface at the second location the ultrasound image data, the video stream, and a plurality of graphical elements each indicating a positional correction;” [clm 25]; “In operation 202, a video stream depicting the probe is acquired. Operations 201 and 202 may occur simultaneously. Preferably, the video stream depicts the probe in relation to the patient.” [0161]; “the ultrasound imaging data and video stream are transmitted to a second location. The imaging data and video stream may be compressed before transmission to reduce bandwidth requirements.” [0162]; [0061-0090, 0147-0168], [fig. 1-2, 7-8, 11; see fig. 2 reproduced below], [see claim 9 rejection]); and

permitting the remote source control of an operation or attribute of the medical device via the shared application (“When remote feedback is desired apparatus 100A is caused to transmit one or both of the ultrasound imaging data and the video stream through a data communication network 106 to remote apparatus 100B.” [0070]; “patient imaging device 104 is adjustable (for example has adjustable pan and/or zoom) and remote interface device 118 includes controls that allow the expert to remotely adjust patient imaging device 104 to obtain a clear view of the procedure or a view that can best help the user” [0074]; “the remote expert may provide feedback to the user by way of remote user interface 118.” [0076]; [0061-0090, 0147-0168], [fig. 1-2, 7-8, 11], [see claim 9 rejection]).

[Figure: Ultrasound data and a video stream of the ultrasound imaging device are acquired and transmitted to a second location, wherein instructions are transmitted to the local user interface based on analysis of both video and ultrasound data (Pelissier, fig. 2)]

It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the invention to combine the method for providing remote guidance taught by Rothberg by permitting control of at least one attribute of the medical device by the skilled resource as taught by Pelissier. It can be difficult to properly capture and analyse ultrasound images. The successful use of ultrasound is dependent on highly-skilled technicians to perform the scans and experienced physicians to interpret them. It typically takes several years of training for a technician to become proficient (Pelissier [0003]). Furthermore, holding the ultrasound device a few inches too high or too low on the subject may make the difference between capturing a medically relevant ultrasound image and capturing a medically irrelevant ultrasound image.
As a result, non-expert operators of an ultrasound device may have considerable trouble capturing medically relevant ultrasound images of a subject (Rothberg [0004]). The technology improvements described may enable, among other capabilities, focused diagnosis, early detection and treatment of conditions by an ultrasound system (Rothberg [0209]), and may increase the effectiveness of remote feedback, thus helping inexperienced ultrasound users to capture accurate images in less time, ultimately leading to improved patient care and reduced costs (Pelissier [0032]). Regarding claim 11, Rothberg and Pelissier teach the method of claim 10, Rothberg further teaching permitting the remote source to transmit feedback to the device operator (“determining whether the ultrasound image contains a target anatomical view; […] generating, using the ultrasound image, a guidance plan for how to guide an operator of the ultrasound device to capture an ultrasound image of the subject containing the target anatomical view;” [clm 20]; “The guidance may be provided via a software application (hereinafter “App”) […] The operator may then position the ultrasound device on the subject and the software application (via the computing device) may provide feedback to the operator indicating whether the operator should reposition the ultrasound device and how he/she should proceed to do so.” [0144]; “the App may be executed on a cloud and communicated to the operator through the smart device. […] the execution of the App may be at a local or a remote device” [0146]; [0181-0232, 0300-0315], [fig. 1, 5A-5B, 9, 15A-15B], [see claim 1 rejection]). Regarding claim 12, Rothberg and Pelissier teach the method of claim 11, Rothberg further teaching conveying the feedback to the device operator via a communication interface (“the computing device may provide an instruction to reposition the ultrasound device to the operator.
The instruction may be, for example, an audible instruction played through a speaker, a visual instruction displayed using a display, and/or a tactile instruction provided using a vibration device (e.g., integrated into the computing device and/or the ultrasound device).” [0230]; [0181-0232, 0300-0315], [fig. 1, 3A-3B, 5A-5B, 9-11, 15A-15B; see fig. 15B reproduced below], [see claim 6 rejection]). Regarding claim 13, Rothberg and Pelissier teach the method of claim 10, Rothberg further teaching wherein the permitting the remote source control comprises, in response to a signal from the remote source, generating haptic feedback to allow the remote source to communicate with the medical device operator (“The instruction may be, […] and/or a tactile instruction provided using a vibration device (e.g., integrated into the computing device and/or the ultrasound device).” [0230]; “the computing device may provide an indication of proper positioning. For example, the computing device may provide an audible confirmation […] or a tactile confirmation provided through a vibration device.” [0232]; “The computing device 1502 comprises an audio output device 1504, an imaging device 1506, a display screen 1508, a processor 1510, a memory 1512, and a vibration device 1509.” [0306]; The vibration device generates tactile feedback for the local user by vibrating to issue tactile instructions [0144-0153, 0171-0174, 0225-0244, 0257-0315], [fig. 1, 3A-3B, 5A-5B, 9-11, 15A-15B]); but Rothberg may fail to teach generating haptic feedback via the medical device. Applicant should note, however, that Rothberg clearly teaches the use of tactile feedback using the computing device, as discussed above.
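As an illustrative sketch only (not part of the Office Action record), the multi-channel instruction delivery Rothberg describes in [0230], in which the computing device issues an audible, visual, or tactile repositioning instruction, might be modeled as follows; the function names, channel keys, and return strings are assumptions for illustration, not Rothberg's disclosure:

```python
from typing import Callable, Dict

# Hypothetical output channels. Rothberg [0230] describes audible, visual,
# and tactile instruction delivery without specifying any API; these names
# and return strings are illustrative assumptions only.
def play_audio(msg: str) -> str:
    return f"speaker: {msg}"

def show_display(msg: str) -> str:
    return f"display: {msg}"

def vibrate(msg: str) -> str:
    return f"vibration device: {msg}"

CHANNELS: Dict[str, Callable[[str], str]] = {
    "audible": play_audio,
    "visual": show_display,
    "tactile": vibrate,
}

def deliver_instruction(modality: str, message: str) -> str:
    """Route a repositioning instruction to the requested output channel."""
    if modality not in CHANNELS:
        raise ValueError(f"unknown modality: {modality}")
    return CHANNELS[modality](message)
```

For example, `deliver_instruction("tactile", "tilt probe")` routes the instruction to the vibration channel, mirroring the claim mapping in which the same instruction content may reach the operator through any of the three modalities.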
However, in the same field of endeavor, Pelissier teaches wherein the permitting the remote source control comprises, in response to a signal from the remote source, generating haptic feedback via the medical device to allow the remote source to communicate with the medical device operator (“If the monitored motion does not correspond to the message received from remote interface 118 then a warning signal may be provided to the user. The warning signal may, for example, comprise a sound, tactile feedback or the like. For example, probe 103 may be controlled to deliver a vibration or other haptic feedback to a user if the user moves the probe 103 in the wrong way.” [0112]; “the tactile feedback is provided by transducers such as piezoelectric transducers and/or heaters on a housing of probe 103 that may be selectively actuated to change the texture, temperature and/or vibration patterns sensed by a user holding probe 103 in a manner that maps intuitively to suggested movements.” [0116]; The probe may provide tactile cues to the user based on the suggestions/instructions provided by the remotely located expert [0061-0090, 0147-0168], [fig. 1-2, 7-8, 11]). It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the invention to modify the method for providing remote guidance taught by Rothberg by generating haptic feedback via the medical device, as taught by Pelissier. The technology improvements described may enable, among other capabilities, focused diagnosis, early detection and treatment of conditions by an ultrasound system (Rothberg [0209]), and may increase the effectiveness of remote feedback, thus helping inexperienced ultrasound users to capture accurate images in less time, ultimately leading to improved patient care and reduced costs (Pelissier [0032]).
Tactile feedback may be selectively actuated to change the texture and/or vibration patterns sensed by a user holding the probe in a manner that maps intuitively to suggested movements (Pelissier [0112]). Regarding claim 14, Rothberg and Pelissier teach the method of claim 13, Rothberg further teaching wherein the medical device contains an accelerometer; but Rothberg fails to teach the remote source has a second medical device that contains an accelerometer. However, in the same field of endeavor, Pelissier teaches wherein the medical device contains an accelerometer (“providing an accelerometer in probe 103 and monitoring one or more outputs of the accelerometer to detect and/or measure inclination of and/or rotations of the probe 103;” [0110]; [0061-0090, 0147-0168], [fig. 1-2, 7-8, 11]), the remote source has a second medical device that contains an accelerometer (“In some embodiments remote interface 118 includes a demonstration device which can be positioned like probe 103 and is equipped with a position sensor. The demonstration device may be a probe 103 or a replica of probe 103 for example.” [0076]; The demonstration device (i.e., second medical device) manipulated by the remote expert may be the same type of probe used by the local user, wherein the demonstration device probe possesses an accelerometer as a position sensor [0061-0090, 0147-0168], [fig. 1-2, 7-8, 11]), and the remote source can manipulate the second medical device such that when the medical device operator has oriented the medical device in the same way as the remote source, the medical device operator receives haptic feedback in their medical device (“The expert may manipulate the demonstration device to demonstrate a particular technique while the position and orientation of the demonstration device is tracked.
The tracked position and orientation of the demonstration device may be transmitted to apparatus 100A which may generate an animation showing the user of apparatus 100A how the expert is manipulating the demonstration device.” [0076]; “the tactile feedback is provided by transducers such as piezoelectric transducers and/or heaters on a housing of probe 103 that may be selectively actuated to change the texture, temperature and/or vibration patterns sensed by a user holding probe 103 in a manner that maps intuitively to suggested movements.” [0116]; The tracked position and orientation of the demonstration device may be transmitted to the apparatus for the local user, wherein the manipulations of the local user may be guided by tactile feedback [0061-0090, 0147-0168], [fig. 1-2, 7-8, 11], [see claim 13 rejection]). It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the invention to modify the method for providing remote guidance taught by Rothberg by generating haptic feedback via the medical device, as taught by Pelissier. The technology improvements described may enable, among other capabilities, focused diagnosis, early detection and treatment of conditions by an ultrasound system (Rothberg [0209]), and may increase the effectiveness of remote feedback, thus helping inexperienced ultrasound users to capture accurate images in less time, ultimately leading to improved patient care and reduced costs (Pelissier [0032]). Tactile feedback may be selectively actuated to change the texture and/or vibration patterns sensed by a user holding the probe in a manner that maps intuitively to suggested movements (Pelissier [0112]).
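As an illustrative aside (again, not part of the record), the claim 14 mapping, in which the tracked orientation of the remote demonstration device is compared against the local probe's accelerometer output and a haptic cue is issued on a match, could be sketched as follows; the single-angle simplification, the tolerance threshold, and all names are assumptions for illustration:

```python
def angular_difference(local_deg: float, remote_deg: float) -> float:
    """Smallest absolute difference between two orientations, in degrees,
    accounting for wrap-around at 360."""
    d = abs(local_deg - remote_deg) % 360.0
    return min(d, 360.0 - d)

def haptic_cue(local_deg: float, remote_deg: float,
               tolerance_deg: float = 5.0) -> str:
    """Compare the local probe's accelerometer-derived orientation with the
    tracked orientation of the remote demonstration device. Return a
    confirmation vibration when the orientations match within tolerance and
    a warning vibration otherwise (cf. Pelissier [0112], [0116])."""
    if angular_difference(local_deg, remote_deg) <= tolerance_deg:
        return "confirm-vibration"
    return "warning-vibration"
```

For example, `haptic_cue(359.0, 2.0)` returns `"confirm-vibration"` because the wrapped angular difference is only 3 degrees, while `haptic_cue(0.0, 180.0)` returns `"warning-vibration"`.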
The successful use of ultrasound is dependent on highly skilled technicians to perform the scans and experienced physicians to interpret them. It typically takes several years of training for a technician to become proficient (Pelissier [0003]). Furthermore, holding the ultrasound device a few inches too high or too low on the subject may make the difference between capturing a medically relevant ultrasound image and capturing a medically irrelevant ultrasound image. As a result, non-expert operators of an ultrasound device may have considerable trouble capturing medically relevant ultrasound images of a subject (Rothberg [0004]). The technology improvements described may enable, among other capabilities, focused diagnosis, early detection and treatment of conditions by an ultrasound system (Rothberg [0209]), and may increase the effectiveness of remote feedback, thus helping inexperienced ultrasound users to capture accurate images in less time, ultimately leading to improved patient care and reduced costs (Pelissier [0032]).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. McMorrow et al. (US20050288584A1; 2005-12-29) teaches a system which includes at least one ultrasound data collection device to carry out a specific ultrasound procedure; resulting ultrasound data is transmitted via a local server at the site of the data collection device to a web database server to produce medical result information, including ultrasound images, which is then sent to a skilled technician, who accepts the results, rejects them, or edits them. Accepted medical results are transmitted back to the medical practitioner [abst]. Tupin et al. (US20100274145A1; 2010-10-28) teaches fetal and/or maternal monitoring devices, systems and methods using UWB medical radar [abst]. The system may provide one or more accounts for patients, doctors, caregivers, etc. to access the patient data.
This data may be sent directly to a physician or caregiver, or it may be accessed from a remote location by the physician/caregiver, and may be configured to send alerts to a physician/caregiver or other based on the indicator of fetal/maternal health [0048]. Chutani et al. (US20110245632A1; 2011-10-06) teaches a method comprising obtaining, by a computer, patient information associated with a patient and operator information associated with one or more biometric sensors [abst]. Computer 110 may be located remotely from biometric sensor 102 and computer 104. Computer 110 may be operated by a remote user to provide instructions which are transmitted to computer 104 [0041]. Holmes et al. (US20130079599A1; 2013-03-28) teaches a method for diagnosing or treating a subject with the aid of a point of service system, including displaying the three-dimensional representation to a healthcare provider in remote communication with the subject, with the aid of a computer system comprising a processor, wherein the system is communicatively coupled to the three-dimensional imaging device; and (d) diagnosing or treating the subject with the aid of the displayed three-dimensional representation of the subject [0171]. Abraham (US 20140039277 A1; 2014-02-06) teaches a remote assessment system for field use in which ultrasound imaging, in combination with other known diagnostic medical equipment for field/emergency use, may be utilized by unskilled personnel in the field, for example, remote from a hospital, under remote guidance of the hospital to assess, diagnose and alleviate field medical injury [0002]. Tran (US20140266787A1; 2014-09-18) teaches a wireless system for a person that includes a wearable appliance with an accelerometer; a wireless device in communication with the wearable appliance; and a remote computer coupled to the wireless device to provide information to an authorized remote user [abst]. Stolka et al.
(US20160119529A1; 2016-04-28) teaches that a system for guidance of an imaging device may include a handheld imaging device, a multidirectional feedback device, and a control unit in communication with the multidirectional feedback device and the handheld imaging device. The control unit may be configured to receive a target location, determine an initial position and pose of the handheld imaging device, calculate a position and pose deviation relative to said initial position and pose, translate said position and pose deviation into control data, and transmit said control data to the multidirectional feedback device, wherein the multidirectional feedback device uses the control data to provide an operator with feedback to guide the handheld imaging device towards the target [abst]. Any inquiry concerning this communication or earlier communications from the examiner should be directed to James F. McDonald III, whose telephone number is (571) 272-7296. The examiner can normally be reached M-F, 8AM-6PM EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chris Koharski, can be reached at 571-272-7230. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

JAMES FRANKLIN MCDONALD III
Examiner, Art Unit 3797

/SHAHDEEP MOHAMMED/
Primary Examiner, Art Unit 3797

Prosecution Timeline

May 03, 2024
Application Filed
Jan 16, 2026
Non-Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12588809: Systems and Methods for Determining Tissue Inflammation Levels of the Eye from Blood Vessel Characteristics (granted Mar 31, 2026; 2y 5m to grant)
Patent 12582378: Methods and Systems for an Invasive Deployable Device Using a Shape Memory Material to Reconfigure Transducer Elements in Response to Stimuli (granted Mar 24, 2026; 2y 5m to grant)
Patent 12564388: Phase Change Insert for Ultrasound Imaging Probe (granted Mar 03, 2026; 2y 5m to grant)
Patent 12544003: System, Method, and Apparatus for Temperature Asymmetry Measurement of Body Parts (granted Feb 10, 2026; 2y 5m to grant)
Patent 12527542: Ultrasound Imaging Apparatus for Biplane Imaging and Control Method Thereof (granted Jan 20, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
55%
Grant Probability
99%
With Interview (+44.3%)
3y 6m
Median Time to Grant
Low
PTA Risk
Based on 76 resolved cases by this examiner. Grant probability derived from career allow rate.
