DETAILED ACTION
This office action is in response to the communication received on 10/06/2025 concerning application no. 16/405,982 filed on 05/07/2019.
Claims 1-2, 5-9, 11-12, 14-15, and 17-23 are pending.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant's arguments filed 10/06/2025 have been fully considered but they are not persuasive.
Regarding the rejection under Pelissier in view of Torp further in view of Choe, Applicant argues that Torp does not teach automatic entry into the navigation mode without user intervention. Applicant further argues that the scanner enters the scanning mode via a user input, which is not taught by the art. Applicant also argues that Torp would not be integrated with Pelissier.
Examiner disagrees. The mere allegation that Torp and Pelissier would not be integrated together is without support and is unpersuasive. MPEP 2145 establishes “If a prima facie case of obviousness is established, the burden shifts to the applicant to come forward with arguments and/or evidence to rebut the prima facie case. See, e.g., In re Dillon, 919 F.2d 688, 692, 16 USPQ2d 1897, 1901 (Fed. Cir. 1990) (en banc). Rebuttal evidence and arguments can be presented in the specification, In re Soni, 54 F.3d 746, 750, 34 USPQ2d 1684, 1687 (Fed. Cir. 1995), by counsel, In re Chu, 66 F.3d 292, 299, 36 USPQ2d 1089, 1094-95 (Fed. Cir. 1995), or by way of an affidavit or declaration under 37 CFR 1.132, e.g., Soni, 54 F.3d at 750, 34 USPQ2d at 1687; In re Piasecki, 745 F.2d 1468, 1474, 223 USPQ 785, 789-90 (Fed. Cir. 1984). However, arguments of counsel cannot take the place of factually supported objective evidence. See, e.g., In re Huang, 100 F.3d 135, 139-40, 40 USPQ2d 1685, 1689 (Fed. Cir. 1996); In re De Blauwe, 736 F.2d 699, 705, 222 USPQ 191, 196 (Fed. Cir. 1984).” In response to applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). Applicant misreads the rejection. The teaching of prompting for scanning is disclosed by Pelissier, not Torp. Additionally, with respect to the automation, Applicant fails to address the fact that Pelissier also teaches the wireless connection between the ultrasound scanner and the interface such that it is automatic and without user intervention. Paragraph 0033 teaches that the display can automatically select and operate with the ultrasound device that is closest or that it has worked with in the past. Paragraph 0062 teaches that the preliminary connections can be Bluetooth, Wi-Fi, Zigbee, LAN, UWB, RF, etc.
It is well-known that wirelessly sending a signal to an interface cannot be performed by a human. As noted in the prior action, filed 07/07/2025, “In response to applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). In response to applicant's argument that Torp cannot be integrated into Pelissier, the test for obviousness is not whether the features of a secondary reference may be bodily incorporated into the structure of the primary reference; nor is it that the claimed invention must be expressly suggested in any one or all of the references. Rather, the test is what the combined teachings of the references would have suggested to those of ordinary skill in the art. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981). Both references discuss an automated digital handshake that is conducted via computer systems. It is well-known that digital connections do not require human intervention. The mere declaration that the amendments are distinct and the prior arguments are not relevant is insufficient. As stated in the prior action, filed 09/05/2024, “Applicant is reminded that the Pelissier reference begins with the title “System and Method for Connecting and Controlling Wireless Ultrasound Imaging System from Electronic device” (Emphasis added). Pelissier then continues in paragraph 0008 to state “a wireless ultrasound imaging system comprises a multi-use display device and an ultrasound imaging device”. Even the protocol communications are stated to be wireless in paragraph 0009 of Pelissier.
It should be further noted that Pelissier clearly recognized the “need for wireless ultrasound imaging systems that enable users of multi-use display devices to connect to ultrasound imaging devices quickly, easily, and securely” (emphasis added), as that is what the reference teaches in paragraph 0006. Even the claims of Pelissier state that the communication between the devices and the interface are wireless communications in at least independent claim 1. In addition to the overwhelming evidence above, Fig. 5 of Pelissier clearly shows that the connection between the imaging apparatus and the display device is a Bluetooth/Wi-Fi-based connection. It is well known that these modes of connection are wireless.1 This is directly in line with Applicant’s own specification. Applicant’s own specification, in paragraph 0029, states that “The communications module 40 may wirelessly transmit signals to and receives signals from the interface 26. The protocol used for communications between the scanner 24 and the interface 26 may be WiFi™ or Bluetooth™, for example, or any other suitable two-way radio communications protocol. The scanner 24 may operate as a WiFi™ hotspot, for example.” (emphasis added). Assuming, arguendo, that the claims, the title, the disclosure, and the figures of Pelissier were silent, Torp would still resolve any deficiencies regarding wireless communication. Torp, in paragraph 0022, describes Fig. 1. In its description, Torp explicitly states “The scan system 101 may be physically connected to the probe 106, or the scan system 101 may be in communication with the probe 106 via a wireless communication technique.” (Emphasis added). Applicant’s allegations regarding the steps occurring without human intervention are further unpersuasive. Applicant was informed in prior actions, filed 06/09/2021, 01/14/2022, 10/04/2022, and 04/25/2023, that Pelissier teaches this element.
Paragraph 0083 teaches that the selection of an ultrasound imaging device for pairing may be automated. Paragraph 0086 teaches that the processor can automate all of the selection. Pelissier acknowledges the benefit of automation in paragraph 0124, where it is stated that it “may reduce the time required to start scanning and make the workflow easier.” Furthermore, Applicant is reminded that the claims are directed towards a wireless communication between a scanner and an interface. Given that both are electromechanical devices that operate via protocols, as discussed throughout Pelissier, it is self-evident that human intervention is not present. Again, as mentioned above, Fig. 5 shows Bluetooth and Wi-Fi being used in communication.” The argument that the communication does not require gestures, buttons, or motion is not persuasive. The claims establish that the scanner and the interface “automatically and without user intervention wirelessly sends a signal”. It is extremely well known that a person is physically, mentally, and biochemically unable to wirelessly send a signal to an interface. As noted in both Pelissier and Torp, the connection is wireless. Such a connection would not utilize a human. It is inherent. In response to applicant's argument that the references fail to show certain features of the invention, it is noted that the features upon which applicant relies (i.e., default navigation, priming in navigation mode, the cited feature of Fig. 3, the preclusion of driving and customizing motion patterns) are not recited in the rejected claim(s). Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993).
The use of the MUDD in Pelissier is relevant, as Applicant’s very own claims state that the connection is performed automatically and wirelessly.” Furthermore, the argument that Torp uses gestures is unpersuasive, as such a control is consistent with Applicant’s very own claims and specification. In at least claims 6-7, the claims establish that the modes are entered via scanning motions and gestures. Applicant’s specification, in at least paragraph 0044, states “In step 120, the scanner 24 may enter the scan mode, for example as a result of the operator of the scanner 24 pressing a button 30 on the scanner 24 (as shown in FIG. 2) or making a gesture with the scanner 24, or otherwise selecting an option highlighted on the screen 28 that leads to a change in mode of the scanner 24 and interface 26 from the navigation mode to the scan mode. In response to the scanner 24 entering the scan mode, the interface 26 may correspondingly operate in the scan mode in step 122”. It is unclear how Applicant can simultaneously establish that no gestures are used in the system at all while also claiming the very gesturing that is performed by an operator. Applicant’s arguments are conclusory and without support… Finally, it is unclear why Applicant misinterprets Choe. In the very paragraph 0036 of Choe that Applicant cites, it clearly states “The handheld ultrasound device can communicate the results of an ultrasound measurement via a communication channel to a portable electronic device 110, such as a tablet, smartphone, smartwatch, smartglasses, or other portable handheld electronic device” (emphasis added). Even in the rejection, it is stated that “Paragraph 0036 teaches that the ultrasound device can communicate the results of an ultrasound measurement via a communication channel to a portable electronic device such as a smartwatch or smart-glasses” (emphasis added).
Applicant is encouraged to read the references and the rejections in their entirety. Assuming, arguendo, that the references were silent, the idea of a digital handshake and using a probe to both perform ultrasound imaging and interface control is extremely well-known. See at least Poland (EP 3324849), Hamlin et al. (WO 2016087984), Poland (WO 2017013511), Lee et al. (US Patent No. 11,571,183), Gu et al. (US Patent No. 11,154,275), and Bell et al. (PGPUB No. US 2018/0263600).
Examiner maintains the rejection.
Applicant's arguments filed 10/06/2025 have been fully considered but they are not persuasive.
Regarding the rejection under Pelissier in view of Wilcox further in view of Choe, Applicant argues that the operability is not automatic and without user intervention. Applicant argues that Wilcox does not teach this limitation and instead relies solely on movement by a user.
Examiner disagrees. In response to applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). MPEP 2145 establishes “If a prima facie case of obviousness is established, the burden shifts to the applicant to come forward with arguments and/or evidence to rebut the prima facie case. See, e.g., In re Dillon, 919 F.2d 688, 692, 16 USPQ2d 1897, 1901 (Fed. Cir. 1990) (en banc). Rebuttal evidence and arguments can be presented in the specification, In re Soni, 54 F.3d 746, 750, 34 USPQ2d 1684, 1687 (Fed. Cir. 1995), by counsel, In re Chu, 66 F.3d 292, 299, 36 USPQ2d 1089, 1094-95 (Fed. Cir. 1995), or by way of an affidavit or declaration under 37 CFR 1.132, e.g., Soni, 54 F.3d at 750, 34 USPQ2d at 1687; In re Piasecki, 745 F.2d 1468, 1474, 223 USPQ 785, 789-90 (Fed. Cir. 1984). However, arguments of counsel cannot take the place of factually supported objective evidence. See, e.g., In re Huang, 100 F.3d 135, 139-40, 40 USPQ2d 1685, 1689 (Fed. Cir. 1996); In re De Blauwe, 736 F.2d 699, 705, 222 USPQ 191, 196 (Fed. Cir. 1984).” Additionally, with respect to the automation, Applicant fails to address the fact that Pelissier also teaches the wireless connection between the ultrasound scanner and the interface such that it is automatic and without user intervention. Paragraph 0033 teaches that the display can automatically select and operate with the ultrasound device that is closest or that it has worked with in the past. Paragraph 0062 teaches that the preliminary connections can be Bluetooth, Wi-Fi, Zigbee, LAN, UWB, RF, etc. It is well-known that wirelessly sending a signal to an interface cannot be performed by a human.
With respect to Wilcox, Applicant does not address that paragraph 0048 teaches that the connection between the workstation and the transducer may be wireless. Communications between devices that operate wirelessly do not require user intervention. With respect to the navigation, it is consistent with Applicant’s own specification, which establishes the forms of navigation as seen in Fig. 4. As noted in the prior action, filed 07/07/2025, “Assuming, arguendo, that the references were silent, the idea of a digital handshake and using a probe to both perform ultrasound imaging and interface control is extremely well-known. See at least Poland (EP 3324849), Hamlin et al. (WO 2016087984), Poland (WO 2017013511), Lee et al. (US Patent No. 11,571,183), Gu et al. (US Patent No. 11,154,275), and Bell et al. (PGPUB No. US 2018/0263600).”
Examiner maintains the rejection.
Applicant's arguments filed 10/06/2025 have been fully considered but they are not persuasive.
Regarding the rejection under Pelissier in view of Poland, Applicant argues that the operability is not automatic and without user intervention. Applicant argues that Poland does not teach this limitation and instead relies solely on movement by a user.
Examiner disagrees. In response to applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). MPEP 2145 establishes “If a prima facie case of obviousness is established, the burden shifts to the applicant to come forward with arguments and/or evidence to rebut the prima facie case. See, e.g., In re Dillon, 919 F.2d 688, 692, 16 USPQ2d 1897, 1901 (Fed. Cir. 1990) (en banc). Rebuttal evidence and arguments can be presented in the specification, In re Soni, 54 F.3d 746, 750, 34 USPQ2d 1684, 1687 (Fed. Cir. 1995), by counsel, In re Chu, 66 F.3d 292, 299, 36 USPQ2d 1089, 1094-95 (Fed. Cir. 1995), or by way of an affidavit or declaration under 37 CFR 1.132, e.g., Soni, 54 F.3d at 750, 34 USPQ2d at 1687; In re Piasecki, 745 F.2d 1468, 1474, 223 USPQ 785, 789-90 (Fed. Cir. 1984). However, arguments of counsel cannot take the place of factually supported objective evidence. See, e.g., In re Huang, 100 F.3d 135, 139-40, 40 USPQ2d 1685, 1689 (Fed. Cir. 1996); In re De Blauwe, 736 F.2d 699, 705, 222 USPQ 191, 196 (Fed. Cir. 1984).” Additionally, with respect to the automation, Applicant fails to address the fact that Pelissier also teaches the wireless connection between the ultrasound scanner and the interface such that it is automatic and without user intervention. Paragraph 0033 teaches that the display can automatically select and operate with the ultrasound device that is closest or that it has worked with in the past. Paragraph 0062 teaches that the preliminary connections can be Bluetooth, Wi-Fi, Zigbee, LAN, UWB, RF, etc. It is well-known that wirelessly sending a signal to an interface cannot be performed by a human. Nothing in the cited sections states that the probe switching is performed according to a user input.
Furthermore, assuming, arguendo, that a user were involved, it would be consistent with Applicant’s own specification, which establishes the forms of navigation as seen in Fig. 4. Applicant further fails to address the fact that Col. 7, line 49 through Col. 8, line 32 teaches the operation in the scanning and control modes. The scanning mode performs imaging, and the control mode allows for the user interface to be controlled via the probe. The probe is wirelessly connected to the system, as the range of unimpeded motion available is broader with an un-cabled probe. The wireless connection is done via a radio link. The motion of spinning the probe 180 or 360 degrees in the palm of the hand, as indicated in Fig. 6, facilitates the switch to the control mode. Once the system has been switched to the control mode, the probe is used as a remote user interface controller, wherein subsequent motions of the probe select and activate configuration parameters for scanning, such as the imaging mode (2D, Flow, CW, PW, etc.), the image Gain or Depth, or other system controls such as image review scrolling and capture and transfer of previously acquired still or cine loop images to storage. Col. 11, lines 33-67 teaches that the connection with the probe is an automatic connection via a pairing protocol. As noted in the prior action, filed 07/07/2025, “Assuming, arguendo, that the references were silent, the idea of a digital handshake and using a probe to both perform ultrasound imaging and interface control is extremely well-known. See at least Poland (EP 3324849), Hamlin et al. (WO 2016087984), Poland (WO 2017013511), Lee et al. (US Patent No. 11,571,183), Gu et al. (US Patent No. 11,154,275), and Bell et al. (PGPUB No. US 2018/0263600).”
Examiner maintains the rejection.
Claim Interpretation
Claim 14 contains a contingent limitation, “when in scanning mode”. According to MPEP 2111.04, “The broadest reasonable interpretation of a method (or process) claim having contingent limitations requires only those steps that must be performed and does not include steps that are not required to be performed because the condition(s) precedent are not met”.
Therefore, the claim element starting with “receiving from the ultrasound scanner, a command to” and ending with “and causing the screen to alter the displayed ultrasound media according to the command” is not required to be performed.
Claim Rejections - 35 USC § 112
The following is a quotation of the first paragraph of 35 U.S.C. 112(a):
(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.
The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:
The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.
Claims 1-2, 5-9, 11-12, 14-15, and 17-23 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.
Claim 1 recites “based on a user input, the ultrasound scanner enters scanning mode”. Paragraph 0044 of the published specification teaches entry into the scan mode via “a result of the operator of the scanner 24 pressing a button 30 on the scanner 24 (as shown in FIG. 2) or making a gesture with the scanner 24, or otherwise selecting an option highlighted on the screen 28”. The recitation “based on a user input, the ultrasound scanner enters scanning mode” can broadly encompass other, undisclosed forms of user input. However, the specification fails to provide support for such inputs, such as audio input, use of a touch screen, or remote operability. That is, via this recitation, the claim broadly encompasses modes that are undisclosed by the specification. Therefore, the claim’s scope is not supported by the specification, as it encompasses forms of user input that were not established in the specification at the time of filing.
Claim 11 recites “based on a user input, the ultrasound scanner is directed to enter scanning mode”. Paragraph 0044 of the published specification teaches entry into the scan mode via “a result of the operator of the scanner 24 pressing a button 30 on the scanner 24 (as shown in FIG. 2) or making a gesture with the scanner 24, or otherwise selecting an option highlighted on the screen 28”. The recitation “based on a user input, the ultrasound scanner is directed to enter scanning mode” can broadly encompass other, undisclosed forms of user input. However, the specification fails to provide support for such inputs, such as audio input, use of a touch screen, or remote operability. That is, via this recitation, the claim broadly encompasses modes that are undisclosed by the specification. Therefore, the claim’s scope is not supported by the specification, as it encompasses forms of user input that were not established in the specification at the time of filing.
Claim 15 recites “based on a user input, enters scanning mode”. Paragraph 0044 of the published specification teaches entry into the scan mode via “a result of the operator of the scanner 24 pressing a button 30 on the scanner 24 (as shown in FIG. 2) or making a gesture with the scanner 24, or otherwise selecting an option highlighted on the screen 28”. The recitation “based on a user input, enters scanning mode” can broadly encompass other, undisclosed forms of user input. However, the specification fails to provide support for such inputs, such as audio input, use of a touch screen, or remote operability. That is, via this recitation, the claim broadly encompasses modes that are undisclosed by the specification. Therefore, the claim’s scope is not supported by the specification, as it encompasses forms of user input that were not established in the specification at the time of filing.
Claims that are not discussed above but are listed as rejected under 35 U.S.C. 112(a) are also rejected because they inherit the deficiencies of the claims from which they respectively depend.
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-2, 5-9, 11-12, 14-15, and 17-23 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 1 is indefinite for the following reasons:
Recites “wireless communications link”. This claim element is indefinite. It is unclear to one of ordinary skill in the art whether this “wireless communications link” is the same as the “wireless communication link” established in preceding claim elements or is a separate and distinct feature.
Applicant is encouraged to provide consistent and clear language.
Recites “based on a user input, the ultrasound scanner enters scanning mode and sends a second mode signal to the interface to operate in scanning mode”. This claim element is indefinite. It is unclear to one of ordinary skill in the art on which device the user input is made. The claims are directed to multiple computational devices, and the claim fails to specify which device receives the user input for entry into the scanning mode.
Applicant is encouraged to provide consistent and clear language.
Claim 9 is indefinite for the following reasons:
Recites “communication link”. This claim element is indefinite. It is unclear to one of ordinary skill in the art whether the “communication link” is the same as the “wireless communication link” established in claim 1 or is a separate and distinct feature.
Applicant is encouraged to provide consistent and clear language throughout the claims.
Claim 11 is indefinite for the following reasons:
Recites “wireless communications link”. This claim element is indefinite. It is unclear to one of ordinary skill in the art whether this “wireless communications link” is the same as the “wireless communication link” established in preceding claim elements or is a separate and distinct feature.
Applicant is encouraged to provide consistent and clear language.
Recites “based on a user input, the ultrasound scanner is directed to enter scanning mode”. This claim element is indefinite. It is unclear to one of ordinary skill in the art on which device the user input is made. The claims are directed to multiple computational devices, and the claim fails to specify which device receives the user input for entry into the scanning mode.
Applicant is encouraged to provide consistent and clear language.
Claim 15 is indefinite for the following reasons:
Recites “wireless communications link”. This claim element is indefinite. It is unclear to one of ordinary skill in the art whether this “wireless communications link” is the same as the “wireless communication link” established in preceding claim elements or is a separate and distinct feature.
Applicant is encouraged to provide consistent and clear language.
Recites “based on a user input, enters scanning mode”. This claim element is indefinite. It is unclear to one of ordinary skill in the art on which device the user input is made. The claims are directed to multiple computational devices, and the claim fails to specify which device receives the user input for entry into the scanning mode.
Applicant is encouraged to provide consistent and clear language.
Claims that are not discussed above but are listed as rejected under 35 U.S.C. 112(b) are also rejected because they inherit the indefiniteness of the claims from which they respectively depend.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-2, 5-9, 12, 14-15, 17-19, and 21-23 are rejected under 35 U.S.C. 103 as being unpatentable over Pelissier et al. (PGPUB No. US 2016/0278739) in view of Torp et al. (PGPUB No. US 2014/0187950) further in view of Choe et al. (PGPUB No. US 2018/0271482).
Regarding claim 1, Pelissier teaches a method for establishing a wireless communication link between an ultrasound scanner and an interface comprising:
the ultrasound scanner broadcasts an identification signal, the interface being configured to listen for the identification signal prior to the identification signal being broadcasted by the ultrasound scanner (Paragraph 0106 teaches that the ultrasound imaging device 104 is periodically sending a wireless communication advertisement signal to the multi-use display device 102. See Fig. 5. Paragraph 0041 teaches that Fig. 5 shows the signal flow diagram between the two communicating devices. Paragraph 0107 teaches that the multi-use display device 102 is in a discovery process and an advertisement signal is then received. It is well-known that “discovery” is where one device listens for another during inquiry.2 See modified Fig. 5 below. Paragraphs 0106-0108 teach that the connection between the ultrasound imaging device and display device is via Bluetooth. Paragraph 0112 teaches that the two devices 102 and 104 can be connected via Wi-Fi);
Modified Fig. 5
the ultrasound scanner detects a request from the interface to establish the wireless communication link with the ultrasound scanner (Paragraph 0107 teaches that the display device 102 transmits an information request signal to the ultrasound imaging device 104 to which the ultrasound device 104 responds to. Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
the ultrasound scanner establishes the wireless communication link with the interface (Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
the interface automatically signals to the screen to display a navigation user interface (Paragraph 0050 teaches that the display device can be used for control signals for the performance of ultrasound imaging. Paragraph 0067 teaches the display and determination of preliminary data. Paragraph 0081 teaches that the ultrasound imaging selection information can be input on the display device);
based on a user input, the ultrasound scanner enters scanning mode and sends a second mode signal to the interface to operate in scanning mode (Paragraph 0114 teaches that the ultrasound data is sent from the imaging device 104 to the display device 102. Paragraph 0098 teaches that the display device 102 receives the ultrasound data acquired from the ultrasound imaging device. Paragraph 0030 teaches that the user control input can prompt the probe into an active state. Paragraphs 0090-0091 teach that the activation of the probe then facilitates the ultrasound imaging and the transmission and display of ultrasound data);
while in scanning mode, sends data from an ultrasound scan and a parameter that is used by the interface to convert the data into an ultrasound media for display on the screen (Paragraph 0115 teaches that the ultrasound data is displayed on the display device 102 via the interface 126 or a secondary display. Paragraphs 0130-0131 teach that the display devices determine the initial set of imaging parameters to be used. The images are generated based on this parameter by the ultrasound device and the information is transmitted back to the display device).
Furthermore, Pelissier teaches the wireless connection between the ultrasound scanner and the interface such that it is automatic and without user intervention (Paragraph 0033 teaches that the display can automatically select and operate with the ultrasound device that is closest or that it has worked with in the past. Paragraph 0062 teaches that the preliminary connections can be Bluetooth, Wi-Fi, Zigbee, LAN, UWB, RF, etc. It is well-known that wirelessly sending a signal to an interface cannot be performed by a human).
However, Pelissier is silent regarding a method,
the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, automatically activates the screen upon establishment of the wireless communications link;
the ultrasound scanner, operable in a navigation mode and a scanning mode, and upon establishment of the wireless communication link with the interface, enters navigation mode and automatically and without user intervention wirelessly sends a first mode signal to the interface, to also enter navigation mode.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Torp teaches a method, the ultrasound scanner, operable in a navigation mode and a scanning mode, and upon establishment of the wireless communication link with the interface, enters navigation mode and automatically and without user intervention wirelessly sends a first mode signal to the interface, to also enter navigation mode (Paragraph 0023 teaches that the connection of the display device to the probe may be wireless. It is well-known that wirelessly sending a signal to an interface cannot be performed by a human. Paragraph 0041 teaches that the specific modes may be related to specific gestures or switch toggles. Such gestures are able to control the position of a cursor on a screen. Such gestures are performed after the connection between the probe and interface is established, as the commands on the probe control the interface. Such commands would not be possible if the devices were not connected. Paragraph 0050 teaches that the probe 106 is able to detect the position of the scanning mode via the motion sensing system 107. Paragraph 0048 teaches that the gesturing controls the cursor, the selection of the icons, pointing, and the interaction with the GUI. Abstract and claims 1, 4-9, 12-16, and 20 teach the use, collection, manipulation, or displaying of ultrasound images. Paragraph 0021 teaches that the device is able to perform scanning or control the input of the patient data, and change the parameters. Paragraph 0041 teaches the input controls. Abstract teaches that the motion of the probe is detected for patterns. Paragraph 0040 teaches that the patterns can be associated with the gestures).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Torp’s teaching of the wireless connection between the scanner and the interface and the operation such that the probe can control the interface and perform imaging. This modified method would allow the user to accurately image in a manner that is less burdensome and less prone to provide corrupted datasets (Paragraph 0002 of Torp). The position and orientation are done with enhanced accuracy and precision (Paragraph 0038 of Torp).
However, Torp is silent regarding a method, the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, automatically activates the screen upon establishment of the wireless communications link.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Choe teaches a method, the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, automatically activates the screen upon establishment of the wireless communications link (Paragraph 0036 teaches that the ultrasound device can communicate the results of an ultrasound measurement via a communication channel to a portable electronic device such as a smartwatch or smart-glasses. The communication channel can be a wireless communication channel that can be Bluetooth, other short distance wireless communication, Wi-Fi communication, or any other wireless communication known to one having skill in the art. It is inherent that a computational system will utilize a processor and memory for the performance of its computational functions).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the combination of Pelissier and Torp with Choe’s teaching of use of a smartwatch screen. This modified method would allow the user to reduce the size, weight, and power consumption of the handheld ultrasound device (Abstract of Choe). Furthermore, the modification allows for post-processing of ultrasound data to improve the image quality (Paragraph 0037 of Choe).
Regarding claim 2, modified Pelissier teaches the method in claim 1, as discussed above.
Pelissier further teaches a method, comprising the ultrasound scanner operating as a mobile communications network hotspot or using a short-range communication protocol to broadcast the identification signal and establish the wireless communication link (Paragraphs 0106-0108 teach that the connection between the ultrasound imaging device and display device is via Bluetooth. Paragraph 0112 teaches that the two devices 102 and 104 can be connected via Wi-Fi).
Regarding claim 5, modified Pelissier teaches the method in claim 1, as discussed above.
However, Pelissier is silent regarding a method, wherein the ultrasound scanner enters scanning mode by the ultrasound scanner detecting an activation of a button on the ultrasound scanner.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Torp teaches a method, wherein the ultrasound scanner enters scanning mode by the ultrasound scanner detecting an activation of a button on the ultrasound scanner (Paragraph 0041 teaches that the mode can be toggled based on the control of the switch 155. Paragraph 0002 teaches that the button instructs when to perform the scan. See Fig. 6).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Torp’s teaching of detecting image mode based on a button activation. This modified method would allow a user to accurately image in a manner that is less burdensome and less prone to provide corrupted datasets (Paragraph 0002 of Torp). Furthermore, the utilization of buttons allows for easy control inputs.
Regarding claim 6, modified Pelissier teaches the method in claim 1, as discussed above.
However, Pelissier is silent regarding a method, wherein the ultrasound scanner enters scanning mode by the ultrasound scanner detecting a gesture of the ultrasound scanner.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Torp teaches a method, wherein the ultrasound scanner enters scanning mode by the ultrasound scanner detecting a gesture of the ultrasound scanner (Paragraph 0041 teaches that a particular mode can be activated based on a particular gesture performed with the probe).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Torp’s teaching of detecting an ultrasound scanning gesture. This modified method would allow a user to accurately image in a manner that is less burdensome and less prone to provide corrupted datasets (Paragraph 0002 of Torp). Furthermore, the use of a gesture with the scanner reduces the need for the user to have to interact with a separate input interface for manipulation. The position and orientation is done with enhanced accuracy and precision (Paragraph 0038 of Torp).
Regarding claim 7, modified Pelissier teaches the method in claim 1, as discussed above.
However, Pelissier is silent regarding a method, wherein the ultrasound scanner enters scanning mode by the ultrasound scanner detecting an ultrasound scanning motion of the ultrasound scanner.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Torp teaches a method, wherein the ultrasound scanner enters scanning mode by the ultrasound scanner detecting an ultrasound scanning motion of the ultrasound scanner (Paragraphs 0041 and 0048 teach that the back-and-forth motion performed with the probe can be used in the selection of a mode).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Torp’s teaching of detecting the ultrasound scanning motion of the ultrasound scanner. This modified method would allow the user to accurately image in a manner that is less burdensome and less prone to provide corrupted datasets (Paragraph 0002 of Torp). The position and orientation are done with enhanced accuracy and precision (Paragraph 0038 of Torp).
Regarding claim 8, modified Pelissier teaches the method in claim 1, as discussed above.
Pelissier further teaches a method, additionally comprising a step of receiving ultrasound input at the ultrasound scanner, in scanning mode, to:
adjust a depth of the ultrasound media displayed on the screen (Paragraph 0097 teaches that the imaging parameters used by the ultrasound imaging device 104 includes the depth of the scan lines. Paragraph 0010 teaches that the ultrasound imaging devices’ selection information is a controllable function. Paragraph 0054 teaches that the control signals result in the control of the ultrasound device 104 which in turn controls the ultrasound signals);
adjust a gain of the ultrasound media displayed on the screen (Paragraph 0097 teaches that the imaging parameters used by the ultrasound imaging device 104 includes the gain of the ultrasound signals. Paragraph 0010 teaches that the ultrasound imaging devices’ selection information is a controllable function. Paragraph 0054 teaches that the control signals result in the control of the ultrasound device 104 which in turn controls the ultrasound signals).
Regarding claim 9, modified Pelissier teaches the method in claim 1, as discussed above.
Pelissier further teaches a method, further comprising:
after the ultrasound scanner broadcasts the identification signal, the interface detecting the identification signal broadcasted from the ultrasound scanner (Paragraph 0106 teaches that the ultrasound imaging device 104 is periodically sending a wireless communication advertisement signal to the multi-use display device 102. Paragraph 0041 teaches that Fig. 5 shows the signal flow diagram between the two communicating devices. Paragraph 0107 teaches that the multi-use display device 102 is in a discovery process and an advertisement signal is then received. See modified Fig. 5 above), and
requesting to establish the communication link with the ultrasound scanner (Paragraph 0107 teaches that the display device 102 transmits an information request signal to the ultrasound imaging device 104, to which the ultrasound device 104 responds. Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
the interface participating in the establishing of the communication link (Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
the interface receiving the data from the ultrasound scan (Paragraph 0114 teaches that the ultrasound data is sent from the imaging device 104 to the display device 102. This is the ultrasound imaging data);
the interface receiving the parameter (Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used. The images are generated based on these parameters by the ultrasound device and the information is transmitted back to the display device);
the interface using the parameter to convert the data into the ultrasound media (Paragraph 0115 teaches that the ultrasound data is displayed on the display device 102 via the interface 126 or a secondary display); and
the interface sending the ultrasound media to the screen for display thereon (Paragraph 0115 teaches that the ultrasound data is displayed on the display device 102 via the interface 126 or a secondary display. Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used. The images are generated based on these parameters by the ultrasound device and the information is transmitted back to the display device).
Regarding claim 11, Pelissier teaches a method for establishing a wireless communication link between an ultrasound scanner and an interface, comprising:
the interface listening for an identification signal broadcast from the ultrasound scanner, the interface being configured to listen for the identification signal prior to the identification signal being broadcasted from the ultrasound scanner (Paragraph 0106 teaches that the ultrasound imaging device 104 is periodically sending a wireless communication advertisement signal to the multi-use display device 102. See Fig. 5. Paragraph 0041 teaches that Fig. 5 shows the signal flow diagram between the two communicating devices. Paragraph 0107 teaches that the multi-use display device 102 is in a discovery process and an advertisement signal is then received. It is well-known that “discovery” is where one device listens for another during inquiry.3 See modified Fig. 5 below. Paragraphs 0106-0108 teach that the connection between the ultrasound imaging device and display device is via Bluetooth. Paragraph 0112 teaches that the two devices 102 and 104 can be connected via Wi-Fi);
[media_image1.png, 691 × 621, greyscale — Modified Fig. 5]
the interface detecting the identification signal, and requesting to establish the wireless communication link with the ultrasound scanner (Paragraph 0107 teaches that the display device 102 transmits an information request signal to the ultrasound imaging device 104, to which the ultrasound device 104 responds. Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
the interface establishing the wireless communication link to the ultrasound scanner (Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
the interface automatically signals to the screen to display a navigation user interface (Paragraph 0050 teaches that the display device can be used for control signals for the performance of ultrasound imaging. Paragraph 0067 teaches the display and determination of preliminary data. Paragraph 0081 teaches that the ultrasound imaging selection information can be input on the display device), and
based on a user input, the ultrasound scanner is directed to enter scanning mode, and the interface wirelessly receives from the ultrasound scanner, a second mode signal to enter scanning mode (Paragraph 0114 teaches that the ultrasound data is sent from the imaging device 104 to the display device 102. Paragraph 0098 teaches that the display device 102 receives the ultrasound data acquired from the ultrasound imaging device. Paragraph 0115 teaches that the ultrasound data is displayed on the display device 102 via the interface 126 or a secondary display. Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used. The images are generated based on these parameters by the ultrasound device and the information is transmitted back to the display device. Paragraph 0030 teaches that the user control input can prompt the probe into an active state. Paragraphs 0090-0091 teach that the activation of the probe then facilitates the ultrasound imaging and transmission and display of ultrasound data);
the interface, while in scanning mode, receiving data from an ultrasound scan (Paragraph 0114 teaches that the ultrasound data is sent from the imaging device 104 to the display device 102);
the interface receiving from the scanner, a parameter (Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used. The images are generated based on these parameters by the ultrasound device and the information is transmitted back to the display device);
the interface using the parameter to convert the data into an ultrasound media (Paragraph 0115 teaches that the ultrasound data is displayed on the display device 102 via the interface 126 or a secondary display. Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used); and
the interface sending the ultrasound media to the screen for display thereon (Paragraph 0115 teaches that the ultrasound data is displayed on the display device 102 via the interface 126 or a secondary display. Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used. The images are generated based on these parameters by the ultrasound device and the information is transmitted back to the display device).
Furthermore, Pelissier teaches the wireless connection between the ultrasound scanner and the interface such that it is automatic and without user intervention (Paragraph 0033 teaches that the display can automatically select and operate with the ultrasound device that is closest or that it has worked with in the past. Paragraph 0062 teaches that the preliminary connections can be Bluetooth, Wi-Fi, Zigbee, LAN, UWB, RF, etc. It is well-known that wirelessly sending a signal to an interface cannot be performed by a human).
However, Pelissier is silent regarding a method,
the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, automatically activates the screen upon establishment of the wireless communications link;
wirelessly receiving from the ultrasound scanner, which is operable in a navigation mode and a scanning mode, upon establishment of the wireless communications link, a first mode signal to enter navigation mode, automatically and without user intervention, wherein, with the ultrasound scanner and the interface both being in the navigation mode.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Torp teaches a method, wirelessly receiving from the ultrasound scanner, which is operable in a navigation mode and a scanning mode, upon establishment of the wireless communications link, a first mode signal to enter navigation mode, automatically and without user intervention, wherein, with the ultrasound scanner and the interface both being in the navigation mode (Paragraph 0023 teaches that the connection of the display device to the probe may be wireless. It is well-known that wirelessly sending a signal to an interface cannot be performed by a human. Paragraph 0041 teaches that the specific modes may be related to specific gestures or switch toggles. Such gestures are able to control the position of a cursor on a screen. Such gestures are performed after the connection between the probe and interface is established, as the commands on the probe control the interface. Such commands would not be possible if the devices were not connected. Paragraph 0050 teaches that the probe 106 is able to detect the position of the scanning mode via the motion sensing system 107. Paragraph 0048 teaches that the gesturing controls the cursor, the selection of the icons, pointing, and the interaction with the GUI. Abstract and claims 1, 4-9, 12-16, and 20 teach the use, collection, manipulation, or displaying of ultrasound images. Paragraph 0021 teaches that the device is able to perform scanning or control the input of the patient data, and change the parameters. Paragraph 0041 teaches the input controls. Abstract teaches that the motion of the probe is detected for patterns. Paragraph 0040 teaches that the patterns can be associated with the gestures).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Torp’s teaching of the wireless connection between the scanner and the interface and the operation such that the probe can control the interface and perform imaging. This modified method would allow the user to accurately image in a manner that is less burdensome and less prone to provide corrupted datasets (Paragraph 0002 of Torp). The position and orientation are done with enhanced accuracy and precision (Paragraph 0038 of Torp).
However, Torp is silent regarding a method, the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, automatically activates the screen upon establishment of the wireless communications link.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Choe teaches a method, the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, automatically activates the screen upon establishment of the wireless communications link (Paragraph 0036 teaches that the ultrasound device can communicate the results of an ultrasound measurement via a communication channel to a portable electronic device such as a smartwatch or smart-glasses. The communication channel can be a wireless communication channel that can be Bluetooth, other short distance wireless communication, Wi-Fi communication, or any other wireless communication known to one having skill in the art. It is inherent that a computational system will utilize a processor and memory for the performance of its computational functions).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the combination of Pelissier and Torp with Choe’s teaching of use of a smartwatch screen. This modified method would allow the user to reduce the size, weight, and power consumption of the handheld ultrasound device (Abstract of Choe). Furthermore, the modification allows for post-processing of ultrasound data to improve the image quality (Paragraph 0037 of Choe).
Regarding claim 14, modified Pelissier teaches the method in claim 11, as discussed above.
Pelissier further teaches a method, comprising:
the interface, when in scanning mode, receiving from the ultrasound scanner, a command to:
adjust a depth of the ultrasound media displayed on the screen (Paragraph 0097 teaches that the imaging parameters used by the ultrasound imaging device 104 includes the depth of the scan lines. Paragraph 0010 teaches that the ultrasound imaging devices’ selection information is a controllable function. Paragraph 0054 teaches that the control signals result in the control of the ultrasound device 104 which in turn controls the ultrasound signals);
adjust a gain of the ultrasound media displayed on the screen (Paragraph 0097 teaches that the imaging parameters used by the ultrasound imaging device 104 includes the gain of the ultrasound signals. Paragraph 0010 teaches that the ultrasound imaging devices’ selection information is a controllable function. Paragraph 0054 teaches that the control signals result in the control of the ultrasound device 104 which in turn controls the ultrasound signals); and
causing the screen to alter the displayed ultrasound media according to the command (Paragraph 0097 teaches that the imaging parameters used by the ultrasound imaging device 104 includes the gain of the ultrasound signals. Paragraph 0010 teaches that the ultrasound imaging devices’ selection information is a controllable function. Fig. 5 shows that the image configuration information is sent in S538, before the ultrasound data and display steps occur in S548 and S550).
Regarding claim 15, Pelissier teaches an ultrasound scanner that establishes a wireless communication link with an interface, comprising:
a processor (Processor 140); and
computer readable memory (Memory 144) storing computer readable instructions, which, when executed by the processor cause the ultrasound scanner, without human intervention, and after the ultrasound scanner is switched on (Paragraph 0124 teaches that the ultrasound imaging device selection and connection is done automatically. Abstract teaches that the communication between the devices occurs when the ultrasound device is in standby state. Paragraph 0083 teaches that the selection of an ultrasound imaging device for pairing may be automated. Paragraph 0086 teaches that the processor can automate all of the selection. Paragraph 0124 teaches that such automation reduces time to start scanning and makes workflow easier), to:
broadcast an identification signal, the interface being configured to listen for the identification signal prior to the identification signal being broadcasted by the ultrasound scanner (Paragraph 0106 teaches that the ultrasound imaging device 104 is periodically sending a wireless communication advertisement signal to the multi-use display device 102. See Fig. 5. Paragraph 0041 teaches that Fig. 5 shows the signal flow diagram between the two communicating devices. Paragraph 0107 teaches that the multi-use display device 102 is in a discovery process and an advertisement signal is then received. It is well-known that “discovery” is where one device listens for another during inquiry.4 See modified Fig. 5 below. Paragraphs 0106-0108 teach that the connection between the ultrasound imaging device and display device is via Bluetooth. Paragraph 0112 teaches that the two devices 102 and 104 can be connected via Wi-Fi);
[media_image1.png, 691 × 621, greyscale — Modified Fig. 5]
detect a request from the interface to establish the wireless communication link with the ultrasound scanner (Paragraph 0107 teaches that the display device 102 transmits an information request signal to the ultrasound imaging device 104, to which the ultrasound device 104 responds. Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
establish the wireless communication link with the interface (Paragraph 0107 teaches that the display device 102 transmits an information request signal to the ultrasound imaging device 104, to which the ultrasound device 104 responds. Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
the interface automatically signals to the screen to display a navigation user interface (Paragraph 0114 teaches that the ultrasound data is sent from the imaging device 10 to the display device 102. Paragraph 0098 teaches that the display device 102 receives the ultrasound data acquired from the ultrasound imaging device);
based on a user input, enter scanning mode; and send, to the interface, a second mode signal directing the interface to operate in scanning mode, data from an ultrasound scan and a parameter that is used by the interface to convert the data into an ultrasound media for display on the screen (Paragraph 0114 teaches that the ultrasound data is sent from the imaging device 104 to the display device 102. Paragraph 0098 teaches that the display device 102 receives the ultrasound data acquired from the ultrasound imaging device. Paragraph 0115 teaches that the ultrasound data is displayed on the display device 102 via the interface 126 or a secondary display. Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used. The images are generated based on these parameters by the ultrasound device and the information is transmitted back to the display device. Paragraph 0030 teaches that the user control input can prompt the probe into an active state. Paragraphs 0090-0091 teach that the activation of the probe then facilitates the ultrasound imaging and transmission and display of ultrasound data).
Furthermore, Pelissier teaches the wireless connection between the ultrasound scanner and the interface such that it is automatic and without user intervention (Paragraph 0033 teaches that the display can automatically select and operate with the ultrasound device that is closest or that it has worked with in the past. Paragraph 0062 teaches that the preliminary connections can be Bluetooth, Wi-Fi, Zigbee, LAN, UWB, RF, etc. It is well-known that wirelessly sending a signal to an interface cannot be performed by a human).
However, Pelissier is silent regarding an ultrasound scanner, the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, being configured to listen for the identification signal prior to the identification signal being broadcasted by the ultrasound scanner,
upon establishment of the wireless communications link with the interface, the ultrasound scanner, operable in a navigation mode and a scanning mode, enters navigation mode and then automatically and without user intervention, wirelessly sends, to the interface, a first mode signal to also enter navigation mode.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Torp teaches an ultrasound scanner, upon establishment of the wireless communications link with the interface, the ultrasound scanner, operable in a navigation mode and a scanning mode, enters navigation mode and then automatically and without user intervention, wirelessly sends, to the interface, a first mode signal to also enter navigation mode (Paragraph 0023 teaches that the connection of the display device to the probe may be wireless. It is well-known that wirelessly sending a signal to an interface cannot be performed by a human. Paragraph 0041 teaches that the specific modes may be related to specific gestures or switch toggles. Such gestures are able to control the position of a cursor on a screen. Such gestures are performed after the connection between the probe and interface is established, as the commands on the probe control the interface. Such commands would not be possible if the devices were not connected. Paragraph 0050 teaches that the probe 106 is able to detect the position of the scanning mode via the motion sensing system 107. Paragraph 0048 teaches that the gesturing controls the cursor, the selection of the icons, pointing, and the interaction with the GUI. Abstract and claims 1, 4-9, 12-16, and 20 teach the use, collection, manipulation, or displaying of ultrasound images. Paragraph 0021 teaches that the device is able to perform scanning or control the input of the patient data, and change the parameters. Paragraph 0041 teaches the input controls. Abstract teaches that the motion of the probe is detected for patterns. Paragraph 0040 teaches that the patterns can be associated with the gestures).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Torp’s teaching of the wireless connection between the scanner and the interface and the operation such that the probe can control the interface and perform imaging. This modified apparatus would allow the user to accurately image in a manner that is less burdensome and less prone to provide corrupted datasets (Paragraph 0002 of Torp). The position and orientation are done with enhanced accuracy and precision (Paragraph 0038 of Torp).
However, Torp is silent regarding an ultrasound scanner, the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Choe teaches an ultrasound scanner, the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses (Paragraph 0036 teaches that the ultrasound device can communicate the results of an ultrasound measurement via a communication channel to a portable electronic device such as a smartwatch or smart-glasses. The communication channel can be a wireless communication channel such as Bluetooth, other short-distance wireless communication, Wi-Fi communication, or any other wireless communication known to one having skill in the art. It is inherent that a computational system will utilize a processor and memory for the performance of its computational functions).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the combination of Pelissier and Torp with Choe’s teaching of use of a smartwatch screen. This modified apparatus would allow the user to reduce the size, weight, and power consumption of the handheld ultrasound device (Abstract of Choe). Furthermore, the modification allows for post-processing of ultrasound data to improve the image quality (Paragraph 0037 of Choe).
Regarding claim 17, modified Pelissier teaches the ultrasound scanner in claim 15, as discussed above.
However, Pelissier is silent regarding an ultrasound scanner, which enters scanning mode by detecting activation of a button on the ultrasound scanner.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Torp teaches an ultrasound scanner, which enters scanning mode by detecting activation of a button on the ultrasound scanner (Paragraph 0041 teaches that the mode can be toggled based on the control of the switch 155. Paragraph 0002 teaches that the button instructs when to perform the scan. See Fig. 6).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Torp’s teaching of detecting image mode based on a button activation. This modified apparatus would allow the user to accurately image in a manner that is less burdensome and less prone to provide corrupted datasets (Paragraph 0002 of Torp). The position and orientation are done with enhanced accuracy and precision (Paragraph 0038 of Torp).
Regarding claim 18, modified Pelissier teaches the ultrasound scanner in claim 15, as discussed above.
However, Pelissier is silent regarding an ultrasound scanner, comprising a multi-axis motion sensor, wherein the computer readable instructions, when executed by the processor, cause the ultrasound scanner to: detect that the ultrasound scanner is in scanning mode by detecting a gesture of the ultrasound scanner.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Torp teaches an ultrasound scanner, comprising a multi-axis motion sensor, wherein the computer readable instructions, when executed by the processor, cause the ultrasound scanner to: detect that the ultrasound scanner is in scanning mode by detecting a gesture of the ultrasound scanner (Motion sensing system 107 includes one or more of the following sensors: a gyro sensor, an accelerometer, and a magnetic sensor. Paragraph 0041 teaches that a particular mode can be activated based on a particular gesture performed with the probe).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Torp’s teaching of detecting an ultrasound scanning gesture. This modified apparatus would allow the user to accurately image in a manner that is less burdensome and less prone to provide corrupted datasets (Paragraph 0002 of Torp). The position and orientation are done with enhanced accuracy and precision (Paragraph 0038 of Torp).
Regarding claim 19, modified Pelissier teaches the ultrasound scanner in claim 15, as discussed above.
However, Pelissier is silent regarding an ultrasound scanner, comprising a multi-axis motion sensor, wherein the computer readable instructions, when executed by the processor, cause the ultrasound scanner to: detect that the ultrasound scanner is in scanning mode by detecting an ultrasound scanning motion of the ultrasound scanner.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Torp teaches an ultrasound scanner, comprising a multi-axis motion sensor, wherein the computer readable instructions, when executed by the processor, cause the ultrasound scanner to: detect that the ultrasound scanner is in scanning mode by detecting an ultrasound scanning motion of the ultrasound scanner (Motion sensing system 107 includes one or more of the following sensors: a gyro sensor, an accelerometer, and a magnetic sensor. Paragraph 0050 teaches that the motion sensing system 107 in the probe collects position information that is used in the reconstruction of data volumes during a scanning mode. Paragraph 0041 teaches that specific gestures such as back and forth motions can be used to activate specific modes).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Torp’s teaching of detection of a scanning mode based on a scanning motion. This modified apparatus would allow the user to accurately image in a manner that is less burdensome and less prone to provide corrupted datasets (Paragraph 0002 of Torp). The position and orientation are done with enhanced accuracy and precision (Paragraph 0038 of Torp).
Regarding claim 21, modified Pelissier teaches the method in claim 1, as discussed above.
However, Pelissier is silent regarding a method, wherein, in navigation mode, the ultrasound scanner sends to the interface at least one of the following: navigation inputs, option selections and alphanumeric data input.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Torp teaches a method, wherein, in navigation mode, the ultrasound scanner sends to the interface at least one of the following: navigation inputs, option selections and alphanumeric data input (Paragraph 0021 teaches that the device is able to perform scanning or control the input of the patient data and change the parameters. Paragraph 0048 teaches the gesturing controls the cursor, the selection of the icons, pointing, and the interaction with the GUI. Paragraph 0047 teaches that the GUI has icons that control parameters, functions, and features).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Torp’s teaching of inputting in a navigation mode. This modified method would allow a user to accurately image in a manner that is less burdensome and less prone to provide corrupted datasets (Paragraph 0002 of Torp). The position and orientation are done with enhanced accuracy and precision (Paragraph 0038 of Torp).
Regarding claim 22, modified Pelissier teaches the method in claim 1, as discussed above.
Pelissier further teaches a method, wherein, in scanning mode, the parameter is data selected from the group consisting of imaging parameters, beamforming parameters and configuration settings (Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used).
Regarding claim 23, modified Pelissier teaches the ultrasound scanner in claim 15, as discussed above.
However, Pelissier is silent regarding an ultrasound scanner, wherein in navigation mode, the ultrasound scanner sends to the interface at least one of the following: navigation inputs, option selections and alphanumeric data input.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Torp teaches an ultrasound scanner, wherein, in navigation mode, the ultrasound scanner sends to the interface at least one of the following: navigation inputs, option selections and alphanumeric data input (Paragraph 0021 teaches that the device is able to perform scanning or control the input of the patient data and change the parameters. Paragraph 0048 teaches the gesturing controls the cursor, the selection of the icons, pointing, and the interaction with the GUI. Paragraph 0047 teaches that the GUI has icons that control parameters, functions, and features).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Torp’s teaching of inputting in a navigation mode. This modified apparatus would allow a user to accurately image in a manner that is less burdensome and less prone to provide corrupted datasets (Paragraph 0002 of Torp). The position and orientation are done with enhanced accuracy and precision (Paragraph 0038 of Torp).
Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Pelissier et al. (PGPUB No. US 2016/0278739) in view of Torp et al. (PGPUB No. US 2014/0187950) further in view of Choe et al. (PGPUB No. US 2018/0271482) further in view of Poland (PGPUB No. US 2015/0245816).
Regarding claim 12, modified Pelissier teaches the method in claim 11, as discussed above.
However, the combination of Pelissier, Torp, and Choe is silent regarding a method, wherein the screen is in a standby or hibernation state and listens for the identification signal broadcast from the ultrasound scanner, the method comprising the interface switching on the screen before sending the ultrasound media to the screen.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Poland teaches a method, wherein the screen is in a standby or hibernation state and listens for the identification signal broadcast from the ultrasound scanner, the method comprising the interface switching on the screen before sending the ultrasound media to the screen (Paragraph 0043 teaches that the docking unit 16 is automatically booted from a hibernated state upon connection with the mobile device 18. This allows for the use of the device 18 hardware including the display 26 used to display the ultrasound imaging information from the probe. Paragraph 0015 teaches the hibernation boot up allows for a rapid reboot. Paragraph 0026 teaches the mobile display device becomes dedicated to the ultrasound scanner function when docked).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the combination of Pelissier, Torp, and Choe with Poland’s teaching of a screen that wakes from a hibernated state upon receiving information from an ultrasound system. This modified method would provide the user with improved ultrasound techniques in terms of costs, portability and multipurpose functionality (Paragraph 0008 of Poland).
Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over Pelissier et al. (PGPUB No. US 2016/0278739) in view of Torp et al. (PGPUB No. US 2014/0187950) further in view of Choe et al. (PGPUB No. US 2018/0271482) further in view of Nefos (PGPUB No. US 2005/0228281).
Regarding claim 20, modified Pelissier teaches the ultrasound scanner in claim 15, as discussed above.
However, the combination of Pelissier, Torp, and Choe is silent regarding an ultrasound scanner, comprising a button, which, when activated, causes a command to be transmitted by the ultrasound scanner to the interface, when the ultrasound scanner and interface are in scanning mode, that:
adjusts a depth of the ultrasound media displaying on the screen;
adjusts a gain of the ultrasound media displaying on the screen; or
freezes an image of the ultrasound media displaying on the screen.
In an analogous imaging field of endeavor, regarding the transmission and control of ultrasound systems, Nefos teaches an ultrasound scanner, comprising a button (Keypad 20), which, when activated, causes a command to be transmitted by the ultrasound scanner to the interface, when the ultrasound scanner and interface are in scanning mode, that:
adjusts a depth of the ultrasound media displayed on the screen (Paragraph 0125 teaches that the key 34 is used to determine the depth for the Doppler mode to be set on);
adjusts a gain of the ultrasound media displayed on the screen (Paragraph 0122 teaches that the required gain can be selected with the keys 33 and 35); or
freezes an image of the ultrasound media displayed on the screen (Paragraph 0048 teaches that the probe contains a freeze button so any displayed image can be held).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the combination of Pelissier, Torp, and Choe with Nefos’s teaching of adjusting gain and depth and freezing an image with button input. This modified apparatus would provide the user with the ability to study an image with closer inspection or measurement (Paragraph 0042 of Nefos).
Claims 1-2, 6-9, 11, 14-15, 18-19, and 21-23 are rejected under 35 U.S.C. 103 as being unpatentable over Pelissier et al. (PGPUB No. US 2016/0278739) in view of Wilcox et al. (PGPUB No. US 2007/0078340) further in view of Choe et al. (PGPUB No. US 2018/0271482).
Regarding claim 1, Pelissier teaches a method for establishing a wireless communication link between an ultrasound scanner and an interface comprising:
the ultrasound scanner broadcasts an identification signal, the interface being configured to listen for the identification signal prior to the identification signal being broadcasted by the ultrasound scanner (Paragraph 0106 teaches that the ultrasound imaging device 104 is periodically sending a wireless communication advertisement signal to the multi-use display device 102. See Fig. 5. Paragraph 0041 teaches that Fig. 5 shows the signal flow diagram between the two communicating devices. Paragraph 0107 teaches that the multi-use display device 102 is in a discovery process and an advertisement signal is then received. It is well-known that “discovery” is where one device listens for another during inquiry.5 See modified Fig. 5 below. Paragraphs 0106-0108 teach that the connection between the ultrasound imaging device and display device is via Bluetooth. Paragraph 0112 teaches that the two devices 102 and 104 can be connected via Wi-Fi);
[media_image1.png — greyscale, 691 × 621]
Modified Fig. 5
the ultrasound scanner detects a request from the interface to establish the wireless communication link with the ultrasound scanner (Paragraph 0107 teaches that the display device 102 transmits an information request signal to the ultrasound imaging device 104, to which the ultrasound device 104 responds. Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
the ultrasound scanner establishes the wireless communication link with the interface (Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
the interface automatically signals to the screen to display a navigation user interface (Paragraph 0050 teaches that the display device can be used for control signals for the performance of ultrasound imaging. Paragraph 0067 teaches the display and determination of preliminary data. Paragraph 0081 teaches that the ultrasound imaging selection information can be input on the display device);
based on a user input, the ultrasound scanner enters scanning mode and sends a second mode signal to the interface to operate in scanning mode (Paragraph 0114 teaches that the ultrasound data is sent from the imaging device 104 to the display device 102. Paragraph 0098 teaches that the display device 102 receives the ultrasound data acquired from the ultrasound imaging device. Paragraph 0030 teaches that the user control input can prompt the probe into an active state. Paragraphs 0090-0091 teach that the activation of the probe then facilitates the ultrasound imaging and the transmission and display of ultrasound data); and
while in scanning mode, sends data from an ultrasound scan and a parameter that is used by the interface to convert the data into an ultrasound media for display on the screen (Paragraph 0115 teaches that the ultrasound data is displayed on the display device 102 via the interface 126 or a secondary display. Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used. The images are generated based on this parameter by the ultrasound device and the information is transmitted back to the display device).
Furthermore, Pelissier teaches the wireless connection between the ultrasound scanner and the interface such that it is automatic and without user intervention (Paragraph 0033 teaches that the display can automatically select and operate with the ultrasound device that is closest or that it has worked with in the past. Paragraph 0062 teaches that the preliminary connections can be Bluetooth, Wi-Fi, Zigbee, LAN, UWB, RF, etc. It is well-known that wirelessly sending a signal to an interface cannot be performed by a human).
However, Pelissier is silent regarding a method, the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, automatically activates the screen upon establishment of the wireless communications link;
the ultrasound scanner, operable in a navigation mode and a scanning mode, and upon establishment of the wireless communication link with the interface, enters navigation mode and automatically and without user intervention wirelessly sends a first mode signal to the interface, to also enter navigation mode.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Wilcox teaches a method, the ultrasound scanner, operable in a navigation mode and a scanning mode, and upon establishment of the wireless communication link with the interface, enters navigation mode and automatically and without user intervention wirelessly sends a first mode signal to the interface, to also enter navigation mode (Abstract teaches a method and system that convert at least one of a predetermined plurality of motion patterns, imparted by an operator of the system to the transducer, into an operational command signal. Paragraph 0048 teaches that the connection between the workstation and the transducer may be wireless. Paragraph 0078 teaches that the patterns of motion are assessed in real time and converted to control signals. Fig. 6 provides various actions that include image capture, control as a mouse cursor, entering report information, proceeding to other parts of a protocol, changing the imaging parameters, and controlling the display. Paragraph 0066 discusses these actions in detail. Figs. 7-8 teach the assessment of the motion to dictate the command);
the interface automatically signals to the screen to display a navigation user interface (Paragraph 0043 teaches the control of the display via the workstation, beamformer, processor, scan convertor and the CPU);
based on a user input, the ultrasound scanner enters scanning mode and sends a second mode signal to the interface to operate in scanning mode (Abstract teaches a method and system that convert at least one of a predetermined plurality of motion patterns, imparted by an operator of the system to the transducer, into an operational command signal. Paragraph 0048 teaches that the connection between the workstation and the transducer may be wireless. Paragraph 0078 teaches that the patterns of motion are assessed in real time and converted to control signals. Fig. 6 provides various actions that include image capture, control as a mouse cursor, entering report information, proceeding to other parts of a protocol, changing the imaging parameters, and controlling the display. Paragraph 0066 discusses these actions in detail. Figs. 7-8 teach the assessment of the motion to dictate the command); and
while in scanning mode, sends data from an ultrasound scan and a parameter that is used by the interface to convert the data into an ultrasound media for display on the screen (Abstract teaches a method and system that convert at least one of a predetermined plurality of motion patterns, imparted by an operator of the system to the transducer, into an operational command signal. Paragraph 0048 teaches that the connection between the workstation and the transducer may be wireless. Paragraph 0078 teaches that the patterns of motion are assessed in real time and converted to control signals. Fig. 6 provides various actions that include image capture, control as a mouse cursor, entering report information, proceeding to other parts of a protocol, changing the imaging parameters, and controlling the display. Paragraph 0066 discusses these actions in detail. Figs. 7-8 teach the assessment of the motion to dictate the command. Paragraph 0044 teaches scan conversion for ready display).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Wilcox’s teaching of the wireless connection between the scanner and the interface and the operation such that the probe can control the interface and perform imaging. This modified method would allow the user to reduce the number of times the operator must touch controls on the workstation (Paragraph 0005 of Wilcox). Furthermore, the modification removes instances in which the user would be in an awkward position while simultaneously reaching the controls with the free hand and placing the frontal portion of the transducer in the proper position on the patient's body, and is not prone to error like speech recognition (Paragraph 0003 of Wilcox).
However, Wilcox is silent regarding a method, the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, automatically activates the screen upon establishment of the wireless communications link.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Choe teaches a method, the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, automatically activates the screen upon establishment of the wireless communications link (Paragraph 0036 teaches that the ultrasound device can communicate the results of an ultrasound measurement via a communication channel to a portable electronic device such as a smartwatch or smart-glasses. The communication channel can be a wireless communication channel such as Bluetooth, other short-distance wireless communication, Wi-Fi communication, or any other wireless communication known to one having skill in the art. It is inherent that a computational system will utilize a processor and memory for the performance of its computational functions).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the combination of Pelissier and Wilcox with Choe’s teaching of use of a smartwatch screen. This modified method would allow the user to reduce the size, weight, and power consumption of the handheld ultrasound device (Abstract of Choe). Furthermore, the modification allows for post-processing of ultrasound data to improve the image quality (Paragraph 0037 of Choe).
Regarding claim 2, modified Pelissier teaches the method in claim 1, as discussed above.
Pelissier further teaches a method, comprising the ultrasound scanner operating as a mobile communications network hotspot or using a short-range communication protocol to broadcast the identification signal and establish the wireless communication link (Paragraphs 0106-0108 teach that the connection between the ultrasound imaging device and display device is via Bluetooth. Paragraph 0112 teaches that the two devices 102 and 104 can be connected via Wi-Fi).
Regarding claim 6, modified Pelissier teaches the method in claim 1, as discussed above.
However, Pelissier is silent regarding a method, wherein the ultrasound scanner enters scanning mode by the ultrasound scanner detecting a gesture of the ultrasound scanner.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Wilcox teaches a method, wherein the ultrasound scanner enters scanning mode by the ultrasound scanner detecting a gesture of the ultrasound scanner (Abstract teaches a method and system that convert at least one of a predetermined plurality of motion patterns, imparted by an operator of the system to the transducer, into an operational command signal. Paragraph 0048 teaches that the connection between the workstation and the transducer may be wireless. Paragraph 0078 teaches that the patterns of motion are assessed in real time and converted to control signals. Fig. 6 provides various actions that include image capture, control as a mouse cursor, entering report information, proceeding to other parts of a protocol, changing the imaging parameters, and controlling the display. Paragraph 0066 discusses these actions in detail. Figs. 7-8 teach the assessment of the motion to dictate the command. Paragraph 0044 teaches scan conversion for ready display).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Wilcox’s teaching of detecting an ultrasound scanning gesture. This modified method would allow the user to reduce the number of times the operator must touch controls on the workstation (Paragraph 0005 of Wilcox). Furthermore, the modification removes instances in which the user would be in an awkward position while simultaneously reaching the controls with the free hand and placing the frontal portion of the transducer in the proper position on the patient's body, and is not prone to error like speech recognition (Paragraph 0003 of Wilcox).
Regarding claim 7, modified Pelissier teaches the method in claim 1, as discussed above.
However, Pelissier is silent regarding a method, wherein the ultrasound scanner enters scanning mode by the ultrasound scanner detecting an ultrasound scanning motion of the ultrasound scanner.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Wilcox teaches a method, wherein the ultrasound scanner enters scanning mode by the ultrasound scanner detecting an ultrasound scanning motion of the ultrasound scanner (Abstract teaches a method and system that convert at least one of a predetermined plurality of motion patterns, imparted by an operator of the system to the transducer, into an operational command signal. Paragraph 0048 teaches that the connection between the workstation and the transducer may be wireless. Paragraph 0078 teaches that the patterns of motion are assessed in real time and converted to control signals. Fig. 6 provides various actions that include image capture, control as a mouse cursor, entering report information, proceeding to other parts of a protocol, changing the imaging parameters, and controlling the display. Figs. 7-8 teach the assessment of the motion to dictate the command. Paragraph 0044 teaches scan conversion for ready display. The capture of the image is done with a curve as noted in Paragraph 0066. See Fig. 5).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Wilcox’s teaching of detecting the ultrasound scanning motion of the ultrasound scanner. This modified method would allow the user to reduce the number of times the operator must touch controls on the workstation (Paragraph 0005 of Wilcox). Furthermore, the modification removes instances in which the user would be in an awkward position while simultaneously reaching the controls with the free hand and placing the frontal portion of the transducer in the proper position on the patient's body, and is not prone to error like speech recognition (Paragraph 0003 of Wilcox).
Regarding claim 8, modified Pelissier teaches the method in claim 1, as discussed above.
Pelissier further teaches a method, additionally comprising a step of receiving user input at the ultrasound scanner, in scanning mode, to:
adjust a depth of the ultrasound media displayed on the screen (Paragraph 0097 teaches that the imaging parameters used by the ultrasound imaging device 104 include the depth of the scan lines. Paragraph 0010 teaches that the ultrasound imaging device’s selection information is a controllable function. Paragraph 0054 teaches that the control signals result in the control of the ultrasound device 104, which in turn controls the ultrasound signals);
adjust a gain of the ultrasound media displayed on the screen (Paragraph 0097 teaches that the imaging parameters used by the ultrasound imaging device 104 include the gain of the ultrasound signals. Paragraph 0010 teaches that the ultrasound imaging device’s selection information is a controllable function. Paragraph 0054 teaches that the control signals result in the control of the ultrasound device 104, which in turn controls the ultrasound signals).
Regarding claim 9, modified Pelissier teaches the method in claim 1, as discussed above.
Pelissier further teaches a method, further comprising:
after the ultrasound scanner broadcasts the identification signal, the interface detecting the identification signal broadcasted from the ultrasound scanner (Paragraph 0106 teaches that the ultrasound imaging device 104 is periodically sending a wireless communication advertisement signal to the multi-use display device 102. Paragraph 0041 teaches that Fig. 5 shows the signal flow diagram between the two communicating devices. Paragraph 0107 teaches that the multi-use display device 102 is in a discovery process and an advertisement signal is then received. See modified Fig. 5 above), and
requesting to establish the communication link with the ultrasound scanner (Paragraph 0107 teaches that the display device 102 transmits an information request signal to the ultrasound imaging device 104, to which the ultrasound device 104 responds. Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
the interface participating in the establishing of the communication link (Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
the interface receiving the data from the ultrasound scan (Paragraph 0114 teaches that the ultrasound data is sent from the imaging device 104 to the display device 102. This is the ultrasound imaging data);
the interface receiving the parameter (Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used. The images are generated based on these parameters by the ultrasound device and the information is transmitted back to the display device);
the interface using the parameter to convert the data into the ultrasound media (Paragraph 0115 teaches that the ultrasound data is displayed on the display device 102 via the interface 126 or a secondary display); and
the interface sending the ultrasound media to the screen for display thereon (Paragraph 0115 teaches that the ultrasound data is displayed on the display device 102 via the interface 126 or a secondary display. Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used. The images are generated based on these parameters by the ultrasound device and the information is transmitted back to the display device).
Regarding claim 11, Pelissier teaches a method for establishing a wireless communication link between an ultrasound scanner and an interface, comprising:
the interface listening for an identification signal broadcast from the ultrasound scanner, the interface being configured to listen for the identification signal prior to the identification signal being broadcasted from the ultrasound scanner (Paragraph 0106 teaches that the ultrasound imaging device 104 periodically sends a wireless communication advertisement signal to the multi-use display device 102. See Fig. 5. Paragraph 0041 teaches that Fig. 5 shows the signal flow diagram between the two communicating devices. Paragraph 0107 teaches that the multi-use display device 102 is in a discovery process and an advertisement signal is then received. It is well-known that “discovery” is where one device listens for another during inquiry.6 See modified Fig. 5 below. Paragraphs 0106-0108 teach that the connection between the ultrasound imaging device and display device is via Bluetooth. Paragraph 0112 teaches that the two devices 102 and 104 can be connected via Wi-Fi);
Modified Fig. 5
the interface detecting the identification signal, and requesting to establish the wireless communication link with the ultrasound scanner (Paragraph 0107 teaches that the display device 102 transmits an information request signal to the ultrasound imaging device 104, to which the ultrasound device 104 responds. Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
the interface establishing the wireless communication link to the ultrasound scanner (Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
the interface automatically signals to the screen to display a navigation user interface (Paragraph 0050 teaches that the display device can be used for control signals for the performance of ultrasound imaging. Paragraph 0067 teaches the display and determination of preliminary data. Paragraph 0081 teaches that the ultrasound imaging selection information can be input on the display device), and
based on a user input, the ultrasound scanner is directed to enter scanning mode, and the interface wirelessly receives from the ultrasound scanner, a second mode signal to enter scanning mode (Paragraph 0114 teaches that the ultrasound data is sent from the imaging device 104 to the display device 102. Paragraph 0098 teaches that the display device 102 receives the ultrasound data acquired from the ultrasound imaging device. Paragraph 0115 teaches that the ultrasound data is displayed on the display device 102 via the interface 126 or a secondary display. Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used. The images are generated based on these parameters by the ultrasound device and the information is transmitted back to the display device. Paragraph 0030 teaches that the user control input can prompt the probe into an active state. Paragraphs 0090-0091 teach that the activation of the probe then facilitates the ultrasound imaging and transmission and display of ultrasound data);
the interface, while in scanning mode, receiving data from an ultrasound scan (Paragraph 0114 teaches that the ultrasound data is sent from the imaging device 104 to the display device 102);
the interface receiving from the scanner, a parameter (Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used. The images are generated based on these parameters by the ultrasound device and the information is transmitted back to the display device);
the interface using the parameter to convert the data into an ultrasound media (Paragraph 0115 teaches that the ultrasound data is displayed on the display device 102 via the interface 126 or a secondary display. Paragraphs 0130-0131 teach that the display devices determine the initial set of imaging parameters to be used.); and
the interface sending the ultrasound media to the screen for display thereon (Paragraph 0115 teaches that the ultrasound data is displayed on the display device 102 via the interface 126 or a secondary display. Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used. The images are generated based on these parameters by the ultrasound device and the information is transmitted back to the display device).
Furthermore, Pelissier teaches the wireless connection between the ultrasound scanner and the interface such that it is automatic and without user intervention (Paragraph 0033 teaches that the display can automatically select and operate with the ultrasound device that is closest or that it has worked with in the past. Paragraph 0062 teaches that the preliminary connections can be Bluetooth, Wi-Fi, Zigbee, LAN, UWB, RF, etc. It is well-known that wirelessly sending a signal to an interface cannot be performed by a human).
However, Pelissier is silent regarding a method, the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, automatically activates the screen upon establishment of the wireless communications link;
wirelessly receiving from the ultrasound scanner, which is operable in a navigation mode and a scanning mode, upon establishment of the wireless communications link, a first mode signal to enter navigation mode, automatically and without user intervention, wherein, with the ultrasound scanner and the interface both being in the navigation mode.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Wilcox teaches a method, wirelessly receiving from the ultrasound scanner, which is operable in a navigation mode and a scanning mode, upon establishment of the wireless communications link, a first mode signal to enter navigation mode, automatically and without user intervention, wherein, with the ultrasound scanner and the interface both being in the navigation mode (Abstract teaches that the method and system convert at least one of a predetermined plurality of motion patterns imparted by an operator of the system to the transducer into the operational command signal. Paragraph 0048 teaches that the connection between the workstation and the transducer may be wireless. Paragraph 0078 teaches that the patterns of motion are assessed in real time and converted to control signals. Fig. 6 provides various actions that include image capture, control as a mouse cursor, entering report information, proceeding to other parts of a protocol, changing the imaging parameters, and controlling the display. Paragraph 0066 discusses these actions in detail. Figs. 7-8 teach the assessment of the motion to dictate the command);
the interface automatically signals to the screen to display a navigation user interface (Paragraph 0043 teaches the control of the display via the workstation, beamformer, processor, scan convertor and the CPU);
based on a user input, the ultrasound scanner is directed to enter scanning mode, and the interface wirelessly receives from the ultrasound scanner, a second mode signal to enter scanning mode (Abstract teaches that the method and system convert at least one of a predetermined plurality of motion patterns imparted by an operator of the system to the transducer into the operational command signal. Paragraph 0048 teaches that the connection between the workstation and the transducer may be wireless. Paragraph 0078 teaches that the patterns of motion are assessed in real time and converted to control signals. Fig. 6 provides various actions that include image capture, control as a mouse cursor, entering report information, proceeding to other parts of a protocol, changing the imaging parameters, and controlling the display. Paragraph 0066 discusses these actions in detail. Figs. 7-8 teach the assessment of the motion to dictate the command);
the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, automatically activates the screen upon establishment of the wireless communications link (Abstract teaches that the method and system convert at least one of a predetermined plurality of motion patterns imparted by an operator of the system to the transducer into the operational command signal. Paragraph 0048 teaches that the connection between the workstation and the transducer may be wireless. Paragraph 0078 teaches that the patterns of motion are assessed in real time and converted to control signals. Fig. 6 provides various actions that include image capture, control as a mouse cursor, entering report information, proceeding to other parts of a protocol, changing the imaging parameters, and controlling the display. Paragraph 0066 discusses these actions in detail. Figs. 7-8 teach the assessment of the motion to dictate the command. Paragraph 0044 teaches scan conversion for ready display).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Wilcox’s teaching of the wireless connection between the scanner and the interface and the operation such that the probe can control the interface and perform imaging. This modified method would reduce the number of times the operator must touch controls on the workstation (Paragraph 0005 of Wilcox). Furthermore, the modification removes instances in which the user must assume awkward positions to simultaneously reach the controls with the free hand while placing the frontal portion of the transducer in the proper position on the patient's body, and is not prone to error like speech recognition (Paragraph 0003 of Wilcox).
However, Wilcox is silent regarding a method, the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, automatically activates the screen upon establishment of the wireless communications link.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Choe teaches a method, the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, automatically activates the screen upon establishment of the wireless communications link (Paragraph 0036 teaches that the ultrasound device can communicate the results of an ultrasound measurement via a communication channel to a portable electronic device such as a smartwatch or smart-glasses. The communication channel can be a wireless communication channel that can be Bluetooth, other short distance wireless communication, Wi-Fi communication, or any other wireless communication known to one having skill in the art. It is inherent that a computational system will utilize a processor and memory for the performance of its computational functions).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the combination of Pelissier and Wilcox with Choe’s teaching of use of a smartwatch screen. This modified method would allow the user to reduce the size, weight, and power consumption of the handheld ultrasound device (Abstract of Choe). Furthermore, the modification allows for post-processing of ultrasound data to improve the image quality (Paragraph 0037 of Choe).
Regarding claim 14, modified Pelissier teaches the method in claim 11, as discussed above.
Pelissier further teaches a method, comprising:
the interface, when in scanning mode, receiving from the ultrasound scanner, a command to:
adjust a depth of the ultrasound media displayed on the screen (Paragraph 0097 teaches that the imaging parameters used by the ultrasound imaging device 104 include the depth of the scan lines. Paragraph 0010 teaches that the ultrasound imaging device’s selection information is a controllable function. Paragraph 0054 teaches that the control signals result in the control of the ultrasound device 104, which in turn controls the ultrasound signals);
adjust a gain of the ultrasound media displayed on the screen (Paragraph 0097 teaches that the imaging parameters used by the ultrasound imaging device 104 include the gain of the ultrasound signals. Paragraph 0010 teaches that the ultrasound imaging device’s selection information is a controllable function. Paragraph 0054 teaches that the control signals result in the control of the ultrasound device 104, which in turn controls the ultrasound signals); and
causing the screen to alter the displayed ultrasound media according to the command (Paragraph 0097 teaches that the imaging parameters used by the ultrasound imaging device 104 include the gain of the ultrasound signals. Paragraph 0010 teaches that the ultrasound imaging device’s selection information is a controllable function. Fig. 5 shows that the image configuration information is sent in S538, before the ultrasound data and display steps occur in S548 and S550).
Regarding claim 15, Pelissier teaches an ultrasound scanner that establishes a wireless communication link with an interface, comprising:
a processor (Processor 140); and
computer readable memory (Memory 144) storing computer readable instructions, which, when executed by the processor cause the ultrasound scanner, without human intervention, and after the ultrasound scanner is switched on (Paragraph 0124 teaches that the ultrasound imaging device selection and connection is done automatically. Abstract teaches that the communication between the devices occurs when the ultrasound device is in standby state. Paragraph 0083 teaches that the selection of an ultrasound imaging device for pairing may be automated. Paragraph 0086 teaches that the processor can automate all of the selection. Paragraph 0124 teaches that such automation reduces time to start scanning and makes workflow easier), to:
broadcast an identification signal, the interface, being configured to listen for the identification signal prior to the identification signal being broadcasted by the ultrasound scanner (Paragraph 0106 teaches that the ultrasound imaging device 104 periodically sends a wireless communication advertisement signal to the multi-use display device 102. See Fig. 5. Paragraph 0041 teaches that Fig. 5 shows the signal flow diagram between the two communicating devices. Paragraph 0107 teaches that the multi-use display device 102 is in a discovery process and an advertisement signal is then received. It is well-known that “discovery” is where one device listens for another during inquiry.7 See modified Fig. 5 below. Paragraphs 0106-0108 teach that the connection between the ultrasound imaging device and display device is via Bluetooth. Paragraph 0112 teaches that the two devices 102 and 104 can be connected via Wi-Fi);
Modified Fig. 5
detect a request from the interface to establish the wireless communication link with the ultrasound scanner (Paragraph 0107 teaches that the display device 102 transmits an information request signal to the ultrasound imaging device 104, to which the ultrasound device 104 responds. Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
establish the wireless communication link with the interface (Paragraph 0107 teaches that the display device 102 transmits an information request signal to the ultrasound imaging device 104, to which the ultrasound device 104 responds. Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
thereafter the interface automatically signals to the screen to display a navigation user interface (Paragraph 0114 teaches that the ultrasound data is sent from the imaging device 104 to the display device 102. Paragraph 0098 teaches that the display device 102 receives the ultrasound data acquired from the ultrasound imaging device);
based on a user input, enters scanning mode; send, to the interface, a second mode signal directing the interface to operate in scanning mode, data from an ultrasound scan and a parameter that is used by the interface to convert the data into an ultrasound media for display on the screen (Paragraph 0114 teaches that the ultrasound data is sent from the imaging device 104 to the display device 102. Paragraph 0098 teaches that the display device 102 receives the ultrasound data acquired from the ultrasound imaging device. Paragraph 0115 teaches that the ultrasound data is displayed on the display device 102 via the interface 126 or a secondary display. Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used. The images are generated based on these parameters by the ultrasound device and the information is transmitted back to the display device. Paragraph 0030 teaches that the user control input can prompt the probe into an active state. Paragraphs 0090-0091 teach that the activation of the probe then facilitates the ultrasound imaging and transmission and display of ultrasound data).
Furthermore, Pelissier teaches the wireless connection between the ultrasound scanner and the interface such that it is automatic and without user intervention (Paragraph 0033 teaches that the display can automatically select and operate with the ultrasound device that is closest or that it has worked with in the past. Paragraph 0062 teaches that the preliminary connections can be Bluetooth, Wi-Fi, Zigbee, LAN, UWB, RF, etc. It is well-known that wirelessly sending a signal to an interface cannot be performed by a human).
However, Pelissier is silent regarding an ultrasound scanner, broadcast an identification signal, the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, being configured to listen for the identification signal prior to the identification signal being broadcasted by the ultrasound scanner;
upon establishment of the wireless communications link with the interface, the ultrasound scanner, operable in a navigation mode and a scanning mode, enters navigation mode and then automatically and without user intervention, wirelessly sends, to the interface, a first mode signal to also enter navigation mode.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Wilcox teaches a method,
upon establishment of the wireless communications link with the interface, the ultrasound scanner, operable in a navigation mode and a scanning mode, enters navigation mode and then automatically and without user intervention, wirelessly sends, to the interface, a first mode signal to also enter navigation mode (Abstract teaches that the method and system convert at least one of a predetermined plurality of motion patterns imparted by an operator of the system to the transducer into the operational command signal. Paragraph 0048 teaches that the connection between the workstation and the transducer may be wireless. Paragraph 0078 teaches that the patterns of motion are assessed in real time and converted to control signals. Fig. 6 provides various actions that include image capture, control as a mouse cursor, entering report information, proceeding to other parts of a protocol, changing the imaging parameters, and controlling the display. Paragraph 0066 discusses these actions in detail. Figs. 7-8 teach the assessment of the motion to dictate the command);
and thereafter the interface automatically signals to the screen to display a navigation user interface (Paragraph 0043 teaches the control of the display via the workstation, beamformer, processor, scan convertor and the CPU);
based on a user input, enters scanning mode; send, to the interface, a second mode signal directing the interface to operate in scanning mode, data from an ultrasound scan and a parameter that is used by the interface to convert the data into an ultrasound media for display on the screen (Abstract teaches that the method and system convert at least one of a predetermined plurality of motion patterns imparted by an operator of the system to the transducer into the operational command signal. Paragraph 0048 teaches that the connection between the workstation and the transducer may be wireless. Paragraph 0078 teaches that the patterns of motion are assessed in real time and converted to control signals. Fig. 6 provides various actions that include image capture, control as a mouse cursor, entering report information, proceeding to other parts of a protocol, changing the imaging parameters, and controlling the display. Paragraph 0066 discusses these actions in detail. Figs. 7-8 teach the assessment of the motion to dictate the command. Paragraph 0044 teaches scan conversion for ready display).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Wilcox’s teaching of the wireless connection between the scanner and the interface and the operation such that the probe can control the interface and perform imaging. This modified apparatus would reduce the number of times the operator must touch controls on the workstation (Paragraph 0005 of Wilcox). Furthermore, the modification removes instances in which the user must assume awkward positions to simultaneously reach the controls with the free hand while placing the frontal portion of the transducer in the proper position on the patient's body, and is not prone to error like speech recognition (Paragraph 0003 of Wilcox).
However, Wilcox is silent regarding a method, broadcast an identification signal, the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, being configured to listen for the identification signal prior to the identification signal being broadcasted by the ultrasound scanner.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Choe teaches a method, broadcast an identification signal, the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, being configured to listen for the identification signal prior to the identification signal being broadcasted by the ultrasound scanner (Paragraph 0036 teaches that the ultrasound device can communicate the results of an ultrasound measurement via a communication channel to a portable electronic device such as a smartwatch or smart-glasses. The communication channel can be a wireless communication channel that can be Bluetooth, other short distance wireless communication, Wi-Fi communication, or any other wireless communication known to one having skill in the art. It is inherent that a computational system will utilize a processor and memory for the performance of its computational functions).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the combination of Pelissier and Wilcox with Choe’s teaching of use of a smartwatch screen. This modified method would allow the user to reduce the size, weight, and power consumption of the handheld ultrasound device (Abstract of Choe). Furthermore, the modification allows for post-processing of ultrasound data to improve the image quality (Paragraph 0037 of Choe).
Regarding claim 18, modified Pelissier teaches the ultrasound scanner in claim 15, as discussed above.
However, Pelissier is silent regarding an ultrasound scanner, comprising a multi-axis motion sensor, wherein the computer readable instructions, when executed by the processor, cause the ultrasound scanner to: detect that the ultrasound scanner is in scanning mode by detecting a gesture of the ultrasound scanner.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Wilcox teaches an ultrasound scanner, comprising a multi-axis motion sensor, wherein the computer readable instructions, when executed by the processor, cause the ultrasound scanner to: detect that the ultrasound scanner is in scanning mode by detecting a gesture of the ultrasound scanner (Paragraph 0068 teaches that the motion tracking can be performed with the dedicated hardware to assess the velocity. Micromachined accelerometers may be used that are disposed within the transducer and can assess the x, y, z axes and a gyroscope can be used to assess the twisting and rolling. See Figs. 2-3 and 5. Abstract teaches that the method and system convert at least one of a predetermined plurality of motion patterns imparted by an operator of the system to the transducer into the operational command signal. Paragraph 0048 teaches that the connection between the workstation and the transducer may be wireless. Paragraph 0078 teaches that the patterns of motion are assessed in real time and converted to control signals. Fig. 6 provides various actions that include image capture, control as a mouse cursor, entering report information, proceeding to other parts of a protocol, changing the imaging parameters, and controlling the display. Paragraph 0066 discusses these actions in detail. Figs. 7-8 teach the assessment of the motion to dictate the command. Paragraph 0044 teaches scan conversion for ready display).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Wilcox’s teaching of the wireless connection between the scanner and the interface and the operation such that the probe can control the interface and perform imaging. This modified apparatus would reduce the number of times the operator must touch controls on the workstation (Paragraph 0005 of Wilcox). Furthermore, the modification removes instances in which the user must assume awkward positions to simultaneously reach the controls with the free hand while placing the frontal portion of the transducer in the proper position on the patient's body, and is not prone to error like speech recognition (Paragraph 0003 of Wilcox).
Regarding claim 19, modified Pelissier teaches the ultrasound scanner in claim 15, as discussed above.
However, Pelissier is silent regarding an ultrasound scanner, comprising a multi-axis motion sensor, wherein the computer readable instructions, when executed by the processor, cause the ultrasound scanner to: detect that the ultrasound scanner is in scanning mode by detecting an ultrasound scanning motion of the ultrasound scanner.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Wilcox teaches an ultrasound scanner, comprising a multi-axis motion sensor, wherein the computer readable instructions, when executed by the processor, cause the ultrasound scanner to: detect that the ultrasound scanner is in scanning mode by detecting an ultrasound scanning motion of the ultrasound scanner (Paragraph 0068 teaches that the motion tracking can be performed with the dedicated hardware to assess the velocity. Micromachined accelerometers may be used that are disposed within the transducer and can assess the x, y, z axes and a gyroscope can be used to assess the twisting and rolling. See Figs. 2-3 and 5. Abstract teaches that the method and system convert at least one of a predetermined plurality of motion patterns imparted by an operator of the system to the transducer into the operational command signal. Paragraph 0048 teaches that the connection between the workstation and the transducer may be wireless. Paragraph 0078 teaches that the patterns of motion are assessed in real time and converted to control signals. Fig. 6 provides various actions that include image capture, control as a mouse cursor, entering report information, proceeding to other parts of a protocol, changing the imaging parameters, and controlling the display. Figs. 7-8 teach the assessment of the motion to dictate the command. Paragraph 0044 teaches scan conversion for ready display. The capture of the image is done with a curve as noted in paragraph 0066. See Fig. 5).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Wilcox’s teaching of detection of a scanning mode based on a scanning motion. This modified apparatus would reduce the number of times the operator must touch controls on the workstation (Paragraph 0005 of Wilcox). Furthermore, the modification removes instances in which the user must assume awkward positions to simultaneously reach the controls with the free hand while placing the frontal portion of the transducer in the proper position on the patient's body, and is not prone to error like speech recognition (Paragraph 0003 of Wilcox).
Regarding claim 21, modified Pelissier teaches the method in claim 1, as discussed above.
However, Pelissier is silent regarding a method, wherein, in navigation mode, the ultrasound scanner sends to the interface at least one of the following: navigation inputs, option selections and alphanumeric data input.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Wilcox teaches a method, wherein, in navigation mode, the ultrasound scanner sends to the interface at least one of the following: navigation inputs, option selections and alphanumeric data input (Paragraph 0068 teaches that the motion tracking can be performed with the dedicated hardware to assess the velocity. Micromachined accelerometers may be used that are disposed within the transducer and can assess the x, y, z axes and a gyroscope can be used to assess the twisting and rolling. See Figs. 2-3 and 5. Abstract teaches that the method and system convert at least one of a predetermined plurality of motion patterns imparted by an operator of the system to the transducer into the operational command signal. Paragraph 0048 teaches that the connection between the workstation and the transducer may be wireless. Paragraph 0078 teaches that the patterns of motion are assessed in real time and converted to control signals. Fig. 6 provides various actions that include image capture, control as a mouse cursor, entering report information, proceeding to other parts of a protocol, changing the imaging parameters, and controlling the display. Figs. 7-8 teach the assessment of the motion to dictate the command. Paragraph 0044 teaches scan conversion for ready display. The capture of the image is done with a curve as noted in paragraph 0066. See Fig. 5).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Wilcox’s teaching of inputting in a navigation mode. This modified apparatus would allow the user to reduce the number of times the operator must touch controls on the workstation (Paragraph 0005 of Wilcox). Furthermore, the modification eliminates instances in which the user must assume awkward positions to simultaneously reach the controls with the free hand while placing the frontal portion of the transducer in the proper position on the patient's body, and it is not prone to error in the manner of speech recognition (Paragraph 0003 of Wilcox).
Regarding claim 22, modified Pelissier teaches the method in claim 1, as discussed above.
Pelissier further teaches a method, wherein, in scanning mode, the parameter is data selected from the group consisting of imaging parameters, beamforming parameters and configuration settings (Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used).
Regarding claim 23, modified Pelissier teaches the method in claim 15, as discussed above.
However, Pelissier is silent regarding a method, wherein in navigation mode, the ultrasound scanner sends to the interface at least one of the following: navigation inputs, option selections and alphanumeric data input.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Wilcox teaches a method, wherein in navigation mode, the ultrasound scanner sends to the interface at least one of the following: navigation inputs, option selections and alphanumeric data input (Paragraph 0068 teaches that the motion tracking can be performed with dedicated hardware to assess the velocity. Micromachined accelerometers disposed within the transducer may be used to assess the x, y, and z axes, and a gyroscope can be used to assess twisting and rolling. See Figs. 2-3 and 5. The Abstract teaches a method and system that convert at least one of a predetermined plurality of motion patterns, imparted by an operator of the system to the transducer, into an operational command signal. Paragraph 0048 teaches that the connection between the workstation and the transducer may be wireless. Paragraph 0078 teaches that the patterns of motion are assessed in real time and converted to control signals. Fig. 6 provides various actions that include image capture, control as a mouse cursor, entering report information, proceeding to other parts of a protocol, changing the imaging parameters, and controlling the display. Figs. 7-8 teach the assessment of the motion to dictate the command. Paragraph 0044 teaches scan conversion for ready display. The capture of the image is done with a curve as noted in paragraph 0066. See Fig. 5).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Wilcox’s teaching of inputting in a navigation mode. This modified apparatus would allow the user to reduce the number of times the operator must touch controls on the workstation (Paragraph 0005 of Wilcox). Furthermore, the modification eliminates instances in which the user must assume awkward positions to simultaneously reach the controls with the free hand while placing the frontal portion of the transducer in the proper position on the patient's body, and it is not prone to error in the manner of speech recognition (Paragraph 0003 of Wilcox).
Claims 5 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Pelissier et al. (PGPUB No. US 2016/0278739) in view of Wilcox et al. (PGPUB No. US 2007/0078340) further in view of Choe et al. (PGPUB No. US 2018/0271482) further in view of Torp et al. (PGPUB No. US 2014/0187950).
Regarding claim 5, modified Pelissier teaches the method in claim 1, as discussed above.
However, the combination of Pelissier, Wilcox, and Choe is silent regarding a method, wherein the ultrasound scanner enters scanning mode by the ultrasound scanner detecting an activation of a button on the ultrasound scanner.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Torp teaches a method, wherein the ultrasound scanner enters scanning mode by the ultrasound scanner detecting an activation of a button on the ultrasound scanner (Paragraph 0041 teaches that the mode can be toggled based on the control of the switch 155. Paragraph 0002 teaches that the button instructs when to perform the scan. See Fig. 6).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the combination of Pelissier, Wilcox, and Choe with Torp’s teaching of detecting an imaging mode based on a button activation. This modified method would allow a user to image accurately in a manner that is less burdensome and less prone to producing corrupted datasets (Paragraph 0002 of Torp). Furthermore, the utilization of buttons allows for easy control inputs.
Regarding claim 17, modified Pelissier teaches the ultrasound scanner in claim 15, as discussed above.
However, the combination of Pelissier, Wilcox, and Choe is silent regarding an ultrasound scanner, which enters scanning mode by detecting activation of a button on the ultrasound scanner.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Torp teaches an ultrasound scanner, which enters scanning mode by detecting activation of a button on the ultrasound scanner (Paragraph 0041 teaches that the mode can be toggled based on the control of the switch 155. Paragraph 0002 teaches that the button instructs when to perform the scan. See Fig. 6).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the combination of Pelissier, Wilcox, and Choe with Torp’s teaching of detecting an imaging mode based on a button activation. This modified apparatus would allow the user to image accurately in a manner that is less burdensome and less prone to producing corrupted datasets (Paragraph 0002 of Torp). Furthermore, the position and orientation are determined with enhanced accuracy and precision (Paragraph 0038 of Torp).
Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Pelissier et al. (PGPUB No. US 2016/0278739) in view of Wilcox et al. (PGPUB No. US 2007/0078340) further in view of Choe et al. (PGPUB No. US 2018/0271482) further in view of Poland (PGPUB No. US 2015/0245816).
Regarding claim 12, modified Pelissier teaches the method in claim 11, as discussed above.
However, the combination of Pelissier, Wilcox, and Choe is silent regarding a method, wherein the screen is in a standby or hibernation state and listens for the identification signal broadcast from the ultrasound scanner, the method comprising the interface switching on the screen before sending the ultrasound media to the screen.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Poland teaches a method, wherein the screen is in a standby or hibernation state and listens for the identification signal broadcast from the ultrasound scanner, the method comprising the interface switching on the screen before sending the ultrasound media to the screen (Paragraph 0043 teaches that the docking unit 16 is automatically booted from a hibernated state upon connection with the mobile device 18. This allows for the use of the device 18 hardware, including the display 26, which is used to display the ultrasound imaging information from the probe. Paragraph 0015 teaches that booting from hibernation allows for a rapid reboot. Paragraph 0026 teaches that the mobile display device becomes dedicated to the ultrasound scanner function when docked).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the combination of Pelissier, Wilcox, and Choe with Poland’s teaching of a screen that wakes upon receiving information from an ultrasound system. This modified method would provide the user with improved ultrasound techniques in terms of costs, portability and multipurpose functionality (Paragraph 0008 of Poland).
Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over Pelissier et al. (PGPUB No. US 2016/0278739) in view of Wilcox et al. (PGPUB No. US 2007/0078340) further in view of Choe et al. (PGPUB No. US 2018/0271482) further in view of Nefos (PGPUB No. US 2005/0228281).
Regarding claim 20, modified Pelissier teaches the ultrasound scanner in claim 15, as discussed above.
However, the combination of Pelissier, Wilcox, and Choe is silent regarding an ultrasound scanner, comprising a button, which, when activated, causes a command to be transmitted by the ultrasound scanner to the interface, when the ultrasound scanner and interface are in scanning mode, that:
adjusts a depth of the ultrasound media displaying on the screen;
adjusts a gain of the ultrasound media displaying on the screen; or
freezes an image of the ultrasound media displaying on the screen.
In an analogous imaging field of endeavor, regarding the transmission and control of ultrasound systems, Nefos teaches an ultrasound scanner, comprising a button (Keypad 20), which, when activated, causes a command to be transmitted by the ultrasound scanner to the interface, when the ultrasound scanner and interface are in scanning mode, that:
adjusts a depth of the ultrasound media displayed on the screen (Paragraph 0125 teaches that the key 34 is used to determine the depth for the Doppler mode to be set on);
adjusts a gain of the ultrasound media displayed on the screen (Paragraph 0122 teaches that the required gain can be selected with the keys 33 and 35); or
freezes an image of the ultrasound media displayed on the screen (Paragraph 0048 teaches that the probe contains a freeze button so any displayed image can be held).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the combination of Pelissier, Wilcox, and Choe with Nefos’s teaching of adjusting gain and depth and freezing an image with button input. This modified apparatus would provide the user with the ability to study an image with closer inspection or measurement (Paragraph 0042 of Nefos).
Claims 1-2, 6-9, 11, 14-15, 18-19, and 21-23 are rejected under 35 U.S.C. 103 as being unpatentable over Pelissier et al. (PGPUB No. US 2016/0278739) in view of Poland (US Patent No. 11,553,895).
Regarding claim 1, Pelissier teaches a method for establishing a wireless communication link between an ultrasound scanner and an interface comprising:
the ultrasound scanner broadcasts an identification signal, the interface being configured to listen for the identification signal prior to the identification signal being broadcasted by the ultrasound scanner (Paragraph 0106 teaches that the ultrasound imaging device 104 periodically sends a wireless communication advertisement signal to the multi-use display device 102. See Fig. 5. Paragraph 0041 teaches that Fig. 5 shows the signal flow diagram between the two communicating devices. Paragraph 0107 teaches that the multi-use display device 102 is in a discovery process and an advertisement signal is then received. It is well-known that “discovery” is where one device listens for another during inquiry.8 See modified Fig. 5 below. Paragraphs 0106-0108 teach that the connection between the ultrasound imaging device and display device is via Bluetooth. Paragraph 0112 teaches that the two devices 102 and 104 can be connected via Wi-Fi);
[media_image1.png (Greyscale): Modified Fig. 5]
the ultrasound scanner detects a request from the interface to establish the wireless communication link with the ultrasound scanner (Paragraph 0107 teaches that the display device 102 transmits an information request signal to the ultrasound imaging device 104, to which the ultrasound device 104 responds. Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
the ultrasound scanner establishes the wireless communication link with the interface (Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
the interface automatically signals to the screen to display a navigation user interface (Paragraph 0050 teaches that the display device can be used for control signals for the performance of ultrasound imaging. Paragraph 0067 teaches the display and determination of preliminary data. Paragraph 0081 teaches that the ultrasound imaging selection information can be input on the display device);
based on a user input, the ultrasound scanner enters scanning mode and sends a second mode signal to the interface to operate in scanning mode (Paragraph 0114 teaches that the ultrasound data is sent from the imaging device 104 to the display device 102. Paragraph 0098 teaches that the display device 102 receives the ultrasound data acquired from the ultrasound imaging device. Paragraph 0030 teaches that the user control input can prompt the probe into an active state. Paragraphs 0090-0091 teach that the activation of the probe then facilitates the ultrasound imaging and the transmission and display of ultrasound data); and
while in scanning mode, sends data from an ultrasound scan and a parameter that is used by the interface to convert the data into an ultrasound media for display on the screen (Paragraph 0115 teaches that the ultrasound data is displayed on the display device 102 via the interface 126 or a secondary display. Paragraphs 0130-0131 teach that the display devices determine the initial set of imaging parameters to be used. The images are generated based on this parameter by the ultrasound device and the information is transmitted back to the display device).
Furthermore, Pelissier teaches the wireless connection between the ultrasound scanner and the interface such that it is automatic and without user intervention (Paragraph 0033 teaches that the display can automatically select and operate with the ultrasound device that is closest or that it has worked with in the past. Paragraph 0062 teaches that the preliminary connections can be Bluetooth, Wi-Fi, Zigbee, LAN, UWB, RF, etc. It is well-known that wirelessly sending a signal to an interface cannot be performed by a human).
However, Pelissier is silent regarding a method, the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, automatically activates the screen upon establishment of the communications link;
the ultrasound scanner, operable in multiple modes comprising a navigation mode and a scanning mode, and upon establishment of the communication link with the interface, enters navigation mode and automatically and without user intervention wirelessly sends a first mode signal to the interface, to also enter navigation mode;
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Poland teaches a method,
the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, automatically activates the screen upon establishment of the wireless communications link (Col. 4, lines 25-41 teaches that the probe can wirelessly communicate with the monitor that is mounted on the wall. The connection is an ultra-wideband radio communication with the ultrasound probe. Col. 11, lines 33-67 teaches that the connection with the probe is an automatic connection via a pairing protocol);
the ultrasound scanner, operable in a navigation mode and a scanning mode, and upon establishment of the wireless communication link with the interface, enters navigation mode and automatically and without user intervention wirelessly sends a first mode signal to the interface, to also enter navigation mode (Col. 7, lines 49-Col. 8, lines 32 teaches the operation in the scanning and control modes. The scanning mode performs imaging, and the control mode allows the user interface to be controlled via the probe. The probe is wirelessly connected to the system, as the range of unimpeded motions available is broader with an un-cabled probe. The wireless connection is done via a radio link. Spinning the probe 180 or 360 degrees in the palm of the hand, as indicated in Fig. 6, switches the system to the control mode. Once the system has been switched to the control mode, the probe is used as a remote user interface controller, wherein subsequent motions of the probe select and activate configuration parameters for scanning, such as the imaging mode (2D, Flow, CW, PW, etc.), the image Gain or Depth, or other system controls such as image review scrolling and capture and transfer of previously acquired still or cine loop images to storage. Col. 11, lines 33-67 teaches that the connection with the probe is an automatic connection via a pairing protocol);
the interface automatically signals to the screen to display a navigation user interface (Col. 7, lines 49-Col. 8, lines 32 teaches once the system has been switched to the control mode, the probe is used as a remote user interface controller, wherein subsequent motions of the probe select and activate configuration parameters for scanning, such as the imaging mode (2D, Flow, CW, PW, etc.), the image Gain or Depth, or other system controls such as image review scrolling and capture and transfer of previously acquired still or cine loop images to storage. The system display will show the current and selectable configuration/control states of the system. On-screen menus, for instance, may be navigated by the motions of the probe while it is in control mode. See Fig. 6);
based on a user input, the ultrasound scanner enters scanning mode and sends a second mode signal to the interface to operate in scanning mode (Col. 8, lines 33-58 teaches to return to the imaging mode, the user may employ the same probe motion so as to toggle the state of the system back to imaging, or use a motion in the reverse direction from the original. Col. 7, lines 49-Col. 8, lines 32 teaches wireless connection is done via a radio link); and,
while in scanning mode, sends data from an ultrasound scan and a parameter that is used by the interface to convert the data into an ultrasound media for display on the screen (Col. 10, lines 39-64 teaches the dongle accepts image line data from the probe and generates an ultrasound image for display. Image processing comprises image persistence, axial and lateral filtering, scan conversion, image smoothing and enhancement, spatial compounding, thresholding, zoom, pan, harmonic image display, and zone focus zone stitching. The Control block receives commands from the user interface and converts the commands into signals which define the next “State” of the system, that is, what the system is to do or image next. The State is reflected back to the user as “Feedback”, confirming to the user that an issued command is being put into effect. The Coefficient Generation block translates the State signals into specific coefficients for the probe hardware which implement the State of the system desired by the user. State signals are also coupled to and used by the Image Processing block to produce an image of the desired type and format (sector, linear, grayscale, color flow, etc.) commanded by the user).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Poland’s teaching of the wireless connection between the ultrasound scanner and the interface, the operation such that the probe can control the interface and perform imaging, and the use of a wall-mounted screen. This modified method would allow the user to have ultrasound system portability with a system configuration that eliminates the need to transport a diagnostically useful display screen (Col. 1, lines 13-39 of Poland). Furthermore, the modification is an improvement because the range of unimpeded motions available is broader with an un-cabled probe (Col. 7, lines 49-Col. 8, lines 32 of Poland).
Regarding claim 2, modified Pelissier teaches the method in claim 1, as discussed above.
Pelissier further teaches a method, comprising the ultrasound scanner operating as a mobile communications network hotspot or using a short-range communication protocol to broadcast the identification signal and establish the wireless communication link (Paragraphs 0106-0108 teach that the connection between the ultrasound imaging device and display device is via Bluetooth. Paragraph 0112 teaches that the two devices 102 and 104 can be connected via Wi-Fi).
Regarding claim 6, modified Pelissier teaches the method in claim 1, as discussed above.
However, Pelissier is silent regarding a method, wherein the ultrasound scanner enters scanning mode by the ultrasound scanner detecting a gesture of the ultrasound scanner.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Poland teaches a method, wherein the ultrasound scanner enters scanning mode by the ultrasound scanner detecting a gesture of the ultrasound scanner (Col. 7, lines 49-Col. 8, lines 32 teaches the operation in the scanning and control modes. The scanning mode performs imaging, and the control mode allows the user interface to be controlled via the probe. The probe is wirelessly connected to the system, as the range of unimpeded motions available is broader with an un-cabled probe. The wireless connection is done via a radio link. Spinning the probe 180 or 360 degrees in the palm of the hand, as indicated in Fig. 6, switches the system to the control mode. Once the system has been switched to the control mode, the probe is used as a remote user interface controller, wherein subsequent motions of the probe select and activate configuration parameters for scanning, such as the imaging mode (2D, Flow, CW, PW, etc.), the image Gain or Depth, or other system controls such as image review scrolling and capture and transfer of previously acquired still or cine loop images to storage. Col. 11, lines 33-67 teaches that the connection with the probe is an automatic connection via a pairing protocol).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Poland’s teaching of detecting an ultrasound scanning gesture. This modified method would allow the user to have ultrasound system portability with a system configuration that eliminates the need to transport a diagnostically useful display screen (Col. 1, lines 13-39 of Poland). Furthermore, the modification is an improvement because the range of unimpeded motions available is broader with an un-cabled probe (Col. 7, lines 49-Col. 8, lines 32 of Poland).
Regarding claim 7, modified Pelissier teaches the method in claim 1, as discussed above.
However, Pelissier is silent regarding a method, wherein the ultrasound scanner enters scanning mode by the ultrasound scanner detecting an ultrasound scanning motion of the ultrasound scanner.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Poland teaches a method, wherein the ultrasound scanner enters scanning mode by the ultrasound scanner detecting an ultrasound scanning motion of the ultrasound scanner (Col. 7, lines 49-Col. 8, lines 32 teaches the operation in the scanning and control modes. The scanning mode performs imaging, and the control mode allows the user interface to be controlled via the probe. The probe is wirelessly connected to the system, as the range of unimpeded motions available is broader with an un-cabled probe. The wireless connection is done via a radio link. Spinning the probe 180 or 360 degrees in the palm of the hand, as indicated in Fig. 6, switches the system to the control mode. Once the system has been switched to the control mode, the probe is used as a remote user interface controller, wherein subsequent motions of the probe select and activate configuration parameters for scanning, such as the imaging mode (2D, Flow, CW, PW, etc.), the image Gain or Depth, or other system controls such as image review scrolling and capture and transfer of previously acquired still or cine loop images to storage. Col. 11, lines 33-67 teaches that the connection with the probe is an automatic connection via a pairing protocol. Fig. 6 shows motions similar to scanning motions).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Poland’s teaching of detecting the ultrasound scanning motion of the ultrasound scanner. This modified method would allow the user to have ultrasound system portability with a system configuration that eliminates the need to transport a diagnostically useful display screen (Col. 1, lines 13-39 of Poland). Furthermore, the modification is an improvement because the range of unimpeded motions available is broader with an un-cabled probe (Col. 7, lines 49-Col. 8, lines 32 of Poland).
Regarding claim 8, modified Pelissier teaches the method in claim 1, as discussed above.
Pelissier further teaches a method, additionally comprising a step of receiving ultrasound input at the ultrasound scanner, in scanning mode, to:
adjust a depth of the ultrasound media displayed on the screen (Paragraph 0097 teaches that the imaging parameters used by the ultrasound imaging device 104 includes the depth of the scan lines. Paragraph 0010 teaches that the ultrasound imaging devices’ selection information is a controllable function. Paragraph 0054 teaches that the control signals result in the control of the ultrasound device 104 which in turn controls the ultrasound signals);
adjust a gain of the ultrasound media displayed on the screen (Paragraph 0097 teaches that the imaging parameters used by the ultrasound imaging device 104 includes the gain of the ultrasound signals. Paragraph 0010 teaches that the ultrasound imaging devices’ selection information is a controllable function. Paragraph 0054 teaches that the control signals result in the control of the ultrasound device 104 which in turn controls the ultrasound signals).
Regarding claim 9, modified Pelissier teaches the method in claim 1, as discussed above.
Pelissier further teaches a method, further comprising:
after the ultrasound scanner broadcasts the identification signal, the interface detecting the identification signal broadcasted from the ultrasound scanner (Paragraph 0106 teaches that the ultrasound imaging device 104 periodically sends a wireless communication advertisement signal to the multi-use display device 102. Paragraph 0041 teaches that Fig. 5 shows the signal flow diagram between the two communicating devices. Paragraph 0107 teaches that the multi-use display device 102 is in a discovery process and an advertisement signal is then received. See modified Fig. 5 above), and
requesting to establish the communication link with the ultrasound scanner (Paragraph 0107 teaches that the display device 102 transmits an information request signal to the ultrasound imaging device 104, to which the ultrasound device 104 responds. Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
the interface participating in the establishing of the communication link (Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
the interface receiving the data from the ultrasound scan (Paragraph 0114 teaches that the ultrasound data is sent from the imaging device 104 to the display device 102. This is the ultrasound imaging data);
the interface receiving the parameter (Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used. The images are generated based on these parameters by the ultrasound device and the information is transmitted back to the display device);
the interface using the parameter to convert the data into the ultrasound media (Paragraph 0115 teaches that the ultrasound data is displayed on the display device 102 via the interface 126 or a secondary display); and
the interface sending the ultrasound media to the screen for display thereon (Paragraph 0115 teaches that the ultrasound data is displayed on the display device 102 via the interface 126 or a secondary display. Paragraphs 0130-0131 teach that the display devices determine the initial set of imaging parameters to be used. The images are generated based on these parameters by the ultrasound device and the information is transmitted back to the display device).
Regarding claim 11, Pelissier teaches a method for establishing a wireless communication link between an ultrasound scanner and an interface comprising:
the interface listening for an identification signal broadcast from the ultrasound scanner, the interface being configured to listen for the identification signal prior to the identification signal being broadcasted from the ultrasound scanner (Paragraph 0106 teaches that the ultrasound imaging device 104 periodically sends a wireless communication advertisement signal to the multi-use display device 102. See Fig. 5. Paragraph 0041 teaches that Fig. 5 shows the signal flow diagram between the two communicating devices. Paragraph 0107 teaches that the multi-use display device 102 is in a discovery process and an advertisement signal is then received. It is well-known that “discovery” is where one device listens for another during inquiry.9 See modified Fig. 5 below. Paragraphs 0106-0108 teach that the connection between the ultrasound imaging device and display device is via Bluetooth. Paragraph 0112 teaches that the two devices 102 and 104 can be connected via Wi-Fi);
[media_image1.png (Greyscale): Modified Fig. 5]
the interface detecting the identification signal, and requesting to establish the wireless communication link with the ultrasound scanner (Paragraph 0107 teaches that the display device 102 transmits an information request signal to the ultrasound imaging device 104, to which the ultrasound device 104 responds. Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
the interface establishing the wireless communication link to the ultrasound scanner (Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
the interface automatically signals to the screen to display a navigation user interface (Paragraph 0050 teaches that the display device can be used for control signals for the performance of ultrasound imaging. Paragraph 0067 teaches the display and determination of preliminary data. Paragraph 0081 teaches that the ultrasound imaging selection information can be input on the display device), and
based on a user input, the ultrasound scanner is directed to enter scanning mode, and the interface wirelessly receives, from the ultrasound scanner, a second mode signal to enter scanning mode (Paragraph 0114 teaches that the ultrasound data is sent from the imaging device 104 to the display device 102. Paragraph 0098 teaches that the display device 102 receives the ultrasound data acquired from the ultrasound imaging device. Paragraph 0115 teaches that the ultrasound data is displayed on the display device 102 via the interface 126 or a secondary display. Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used. The images are generated based on these parameters by the ultrasound device and the information is transmitted back to the display device. Paragraph 0030 teaches that the user control input can prompt the probe into an active state. Paragraphs 0090-0091 teach that the activation of the probe then facilitates the ultrasound imaging and the transmission and display of ultrasound data);
the interface, while in scanning mode, receiving data from an ultrasound scan (Paragraph 0114 teaches that the ultrasound data is sent from the imaging device 104 to the display device 102);
the interface receiving from the scanner, a parameter (Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used. The images are generated based on these parameters by the ultrasound device and the information is transmitted back to the display device);
the interface using the parameter to convert the data into an ultrasound media (Paragraph 0115 teaches that the ultrasound data is displayed on the display device 102 via the interface 126 or a secondary display. Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used); and
the interface sending the ultrasound media to the screen for display thereon (Paragraph 0115 teaches that the ultrasound data is displayed on the display device 102 via the interface 126 or a secondary display. Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used. The images are generated based on these parameters by the ultrasound device and the information is transmitted back to the display device).
Furthermore, Pelissier teaches the wireless connection between the ultrasound scanner and the interface such that it is automatic and without user intervention (Paragraph 0033 teaches that the display can automatically select and operate with the ultrasound device that is closest or has worked with in the past. Paragraph 0062 teaches that the preliminary connections can be Bluetooth, Wi-Fi, Zigbee, LAN, UWB, RF, etc. It is well-known that wirelessly sending a signal to an interface cannot be performed by a human).
However, Pelissier is silent regarding a method,
the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, automatically activates the screen upon establishment of the wireless communications link;
wirelessly receiving from the ultrasound scanner, which is operable in a navigation mode and a scanning mode, upon establishment of the wireless communications link, a first mode signal to enter navigation mode, automatically and without user intervention, wherein, with the ultrasound scanner and the interface both being in the navigation mode.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Poland teaches a method,
the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, automatically activates the screen upon establishment of the wireless communications link (Col. 4, lines 25-41 teaches that the probe can wirelessly communicate with the monitor that is mounted on the wall. The connection is an ultra-wideband radio communication with the ultrasound probe. Col. 11, lines 33-67 teaches that the connection with the probe is an automatic connection via a pairing protocol);
wirelessly receiving from the ultrasound scanner, which is operable in a navigation mode and a scanning mode, upon establishment of the wireless communications link, a first mode signal to enter navigation mode, automatically and without user intervention, wherein, with the ultrasound scanner and the interface both being in the navigation mode (Col. 7, line 49-Col. 8, line 32 teaches the operation in the scanning and control modes. The scanning mode performs imaging and the control mode allows for the user interface to be controlled via the probe. The probe is wirelessly connected to the system, as the range of unimpeded motions available is broader with an un-cabled probe. The wireless connection is done via a radio link. A motion such as spinning the probe 180 or 360 degrees in the palm of the hand, as indicated in Fig. 6, switches the system to the control mode. Once the system has been switched to the control mode, the probe is used as a remote user interface controller, wherein subsequent motions of the probe select and activate configuration parameters for scanning, such as the imaging mode (2D, Flow, CW, PW, etc.), the image Gain or Depth, or other system controls such as image review scrolling and capture and transfer of previously acquired still or cine loop images to storage. Col. 11, lines 33-67 teaches that the connection with the probe is an automatic connection via a pairing protocol);
the interface automatically signals to the screen to display a navigation user interface (Col. 7, line 49-Col. 8, line 32 teaches that once the system has been switched to the control mode, the probe is used as a remote user interface controller, wherein subsequent motions of the probe select and activate configuration parameters for scanning, such as the imaging mode (2D, Flow, CW, PW, etc.), the image Gain or Depth, or other system controls such as image review scrolling and capture and transfer of previously acquired still or cine loop images to storage. The system display will show the current and selectable configuration/control states of the system. On-screen menus, for instance, may be navigated by the motions of the probe while it is in control mode. See Fig. 6);
based on a user input, the ultrasound scanner is directed to enter scanning mode, and the interface wirelessly receives from the ultrasound scanner, a second mode signal to enter scanning mode (Col. 8, lines 33-58 teaches that, to return to the imaging mode, the user may employ the same probe motion so as to toggle the state of the system back to imaging, or use a motion in the reverse direction from the original. Col. 7, line 49-Col. 8, line 32 teaches that the wireless connection is done via a radio link);
the interface, while in scanning mode, receiving data from an ultrasound scan, the interface receiving from the scanner, a parameter; the interface using the parameter to convert the data into an ultrasound media; and the interface sending the ultrasound media to the screen for display thereon (Col. 10, lines 39-64 teaches that the dongle accepts image line data from the probe and generates an ultrasound image for display. Image processing comprises image persistence, axial and lateral filtering, scan conversion, image smoothing and enhancement, spatial compounding, thresholding, zoom, pan, harmonic image display, and zone focus zone stitching. The Control block receives commands from the user interface and converts the commands into signals which define the next “State” of the system, that is, what the system is to do or image next. The State is reflected back to the user as “Feedback”, confirming to the user that an issued command is being put into effect. The Coefficient Generation block translates the State signals into specific coefficients for the probe hardware which implement the State of the system desired by the user. State signals are also coupled to and used by the Image Processing block to produce an image of the desired type and format (sector, linear, grayscale, color flow, etc.) commanded by the user).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Poland’s teaching of the wireless connection between the probe and the display system, the operation such that the probe can control the interface and perform imaging, and the use of a wall-mounted screen. This modified method would allow the user to have ultrasound system portability with a system configuration that eliminates the need to transport a diagnostically useful display screen (Col. 1, lines 13-39 of Poland). Furthermore, the modification is an improvement as it provides a broader range of unimpeded motion with an un-cabled probe (Col. 7, line 49-Col. 8, line 32 of Poland).
Regarding claim 14, modified Pelissier teaches the method in claim 11, as discussed above.
Pelissier further teaches a method, comprising:
the interface, when in scanning mode, receiving from the ultrasound scanner, a command to:
adjust a depth of the ultrasound media displayed on the screen (Paragraph 0097 teaches that the imaging parameters used by the ultrasound imaging device 104 includes the depth of the scan lines. Paragraph 0010 teaches that the ultrasound imaging devices’ selection information is a controllable function. Paragraph 0054 teaches that the control signals result in the control of the ultrasound device 104 which in turn controls the ultrasound signals);
adjust a gain of the ultrasound media displayed on the screen (Paragraph 0097 teaches that the imaging parameters used by the ultrasound imaging device 104 includes the gain of the ultrasound signals. Paragraph 0010 teaches that the ultrasound imaging devices’ selection information is a controllable function. Paragraph 0054 teaches that the control signals result in the control of the ultrasound device 104 which in turn controls the ultrasound signals); and
causing the screen to alter the displayed ultrasound media according to the command (Paragraph 0097 teaches that the imaging parameters used by the ultrasound imaging device 104 includes the gain of the ultrasound signals. Paragraph 0010 teaches that the ultrasound imaging devices’ selection information is a controllable function. Fig. 5 shows that the image configuration information is sent in S538, before the ultrasound data and display steps occur in S548 and S550).
Regarding claim 15, Pelissier teaches an ultrasound scanner that establishes a wireless communication link with an interface, comprising:
a processor (Processor 140); and
computer readable memory (Memory 144) storing computer readable instructions, which, when executed by the processor, cause the ultrasound scanner, without human intervention, and after the ultrasound scanner is switched on (Paragraph 0124 teaches that the ultrasound imaging device selection and connection is done automatically. The Abstract teaches that the communication between the devices occurs when the ultrasound device is in a standby state. Paragraph 0083 teaches that the selection of an ultrasound imaging device for pairing may be automated. Paragraph 0086 teaches that the processor can automate all of the selection. Paragraph 0124 teaches that such automation reduces time to start scanning and makes workflow easier), to:
broadcast an identification signal, the interface, being configured to listen for the identification signal prior to the identification signal being broadcasted by the ultrasound scanner (Paragraph 0106 teaches that the ultrasound imaging device 104 is periodically sending a wireless communication advertisement signal to the multi-use display device 102. See Fig. 5. Paragraph 0041 teaches that Fig. 5 shows the signal flow diagram between the two communicating devices. Paragraph 0107 teaches that the multi-use display device 102 is in a discovery process and an advertisement signal is then received. It is well-known that “discovery” is where one device listens for another during inquiry. See modified Fig. 5 below. Paragraphs 0106-0108 teach that the connection between the ultrasound imaging device and display device is via Bluetooth. Paragraph 0112 teaches that the two devices 102 and 104 can be connected via Wi-Fi);
[Image: media_image1.png, greyscale, 691×621]
Modified Fig. 5
detect a request from the interface to establish the wireless communication link with the ultrasound scanner (Paragraph 0107 teaches that the display device 102 transmits an information request signal to the ultrasound imaging device 104 to which the ultrasound device 104 responds to. Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
establish the wireless communication link with the interface (Paragraph 0107 teaches that the display device 102 transmits an information request signal to the ultrasound imaging device 104 to which the ultrasound device 104 responds to. Paragraph 0108 teaches that the display device 102 and imaging device 104 establish a connection via wireless communication. See Fig. 5);
and thereafter the interface automatically signals to the screen to display a navigation user interface (Paragraph 0114 teaches that the ultrasound data is sent from the imaging device 104 to the display device 102. Paragraph 0098 teaches that the display device 102 receives the ultrasound data acquired from the ultrasound imaging device);
based on a user input, enters scanning mode; send, to the interface, a second mode signal directing the interface to operate in scanning mode, data from an ultrasound scan and a parameter that is used by the interface to convert the data into an ultrasound media for display on the screen (Paragraph 0114 teaches that the ultrasound data is sent from the imaging device 104 to the display device 102. Paragraph 0098 teaches that the display device 102 receives the ultrasound data acquired from the ultrasound imaging device. Paragraph 0115 teaches that the ultrasound data is displayed on the display device 102 via the interface 126 or a secondary display. Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used. The images are generated based on these parameters by the ultrasound device and the information is transmitted back to the display device. Paragraph 0030 teaches that the user control input can prompt the probe into an active state. Paragraphs 0090-0091 teach that the activation of the probe then facilitates the ultrasound imaging and transmission and display of ultrasound data).
Furthermore, Pelissier teaches the wireless connection between the ultrasound scanner and the interface such that it is automatic and without user intervention (Paragraph 0033 teaches that the display can automatically select and operate with the ultrasound device that is closest or has worked with in the past. Paragraph 0062 teaches that the preliminary connections can be Bluetooth, Wi-Fi, Zigbee, LAN, UWB, RF, etc. It is well-known that wirelessly sending a signal to an interface cannot be performed by a human).
However, Pelissier is silent regarding an ultrasound scanner, broadcast an identification signal, the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, being configured to listen for the identification signal prior to the identification signal being broadcasted by the ultrasound scanner;
upon establishment of the wireless communications link with the interface, the ultrasound scanner, operable in a navigation mode and a scanning mode, enters navigation mode and then automatically and without user intervention, wirelessly sends, to the interface, a first mode signal to also enter navigation mode.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Poland teaches an ultrasound scanner,
broadcast an identification signal, the interface, which controls a screen selected from the group consisting of a wall-mounted screen, a smart watch screen, and wearable augmented reality glasses, being configured to listen for the identification signal prior to the identification signal being broadcasted by the ultrasound scanner (Col. 4, lines 25-41 teaches that the probe can wirelessly communicate with the monitor that is mounted on the wall. The connection is an ultra-wideband radio communication with the ultrasound probe. Col. 11, lines 33-67 teaches that the connection with the probe is an automatic connection via a pairing protocol);
upon establishment of the wireless communications link with the interface, the ultrasound scanner, operable in a navigation mode and a scanning mode, enters navigation mode and then automatically and without user intervention, wirelessly sends, to the interface, a first mode signal to also enter navigation mode (Col. 7, line 49-Col. 8, line 32 teaches the operation in the scanning and control modes. The scanning mode performs imaging and the control mode allows for the user interface to be controlled via the probe. The probe is wirelessly connected to the system, as the range of unimpeded motions available is broader with an un-cabled probe. The wireless connection is done via a radio link. A motion such as spinning the probe 180 or 360 degrees in the palm of the hand, as indicated in Fig. 6, switches the system to the control mode. Once the system has been switched to the control mode, the probe is used as a remote user interface controller, wherein subsequent motions of the probe select and activate configuration parameters for scanning, such as the imaging mode (2D, Flow, CW, PW, etc.), the image Gain or Depth, or other system controls such as image review scrolling and capture and transfer of previously acquired still or cine loop images to storage. Col. 11, lines 33-67 teaches that the connection with the probe is an automatic connection via a pairing protocol);
and thereafter the interface automatically signals to the screen to display a navigation user interface (Col. 7, line 49-Col. 8, line 32 teaches that once the system has been switched to the control mode, the probe is used as a remote user interface controller, wherein subsequent motions of the probe select and activate configuration parameters for scanning, such as the imaging mode (2D, Flow, CW, PW, etc.), the image Gain or Depth, or other system controls such as image review scrolling and capture and transfer of previously acquired still or cine loop images to storage. The system display will show the current and selectable configuration/control states of the system. On-screen menus, for instance, may be navigated by the motions of the probe while it is in control mode. See Fig. 6);
based on a user input, enters scanning mode; send, to the interface, a second mode signal directing the interface to operate in scanning mode, data from an ultrasound scan and a parameter that is used by the interface to convert the data into an ultrasound media for display on the screen (Col. 8, lines 33-58 teaches that, to return to the imaging mode, the user may employ the same probe motion so as to toggle the state of the system back to imaging, or use a motion in the reverse direction from the original. Col. 7, line 49-Col. 8, line 32 teaches that the wireless connection is done via a radio link. Col. 10, lines 39-64 teaches that the dongle accepts image line data from the probe and generates an ultrasound image for display. Image processing comprises image persistence, axial and lateral filtering, scan conversion, image smoothing and enhancement, spatial compounding, thresholding, zoom, pan, harmonic image display, and zone focus zone stitching. The Control block receives commands from the user interface and converts the commands into signals which define the next “State” of the system, that is, what the system is to do or image next. The State is reflected back to the user as “Feedback”, confirming to the user that an issued command is being put into effect. The Coefficient Generation block translates the State signals into specific coefficients for the probe hardware which implement the State of the system desired by the user. State signals are also coupled to and used by the Image Processing block to produce an image of the desired type and format (sector, linear, grayscale, color flow, etc.) commanded by the user).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Poland’s teaching of the wireless connection between the probe and the display system, the operation such that the probe can control the interface and perform imaging, and the use of a wall-mounted screen. This modified apparatus would allow the user to have ultrasound system portability with a system configuration that eliminates the need to transport a diagnostically useful display screen (Col. 1, lines 13-39 of Poland). Furthermore, the modification is an improvement as it provides a broader range of unimpeded motion with an un-cabled probe (Col. 7, line 49-Col. 8, line 32 of Poland).
Regarding claim 18, modified Pelissier teaches the ultrasound scanner in claim 15, as discussed above.
However, Pelissier is silent regarding an ultrasound scanner, comprising a multi-axis motion sensor, wherein the computer readable instructions, when executed by the processor, cause the ultrasound scanner to: detect that the ultrasound scanner is in scanning mode by detecting a gesture of the ultrasound scanner.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Poland teaches an ultrasound scanner, comprising a multi-axis motion sensor, wherein the computer readable instructions, when executed by the processor, cause the ultrasound scanner to: detect that the ultrasound scanner is in scanning mode by detecting a gesture of the ultrasound scanner (Col. 7, line 49-Col. 8, line 32 teaches that the probe contains a 3-axis or 6-axis motion sensor such as an accelerometer. The accelerometer is used to sense probe motions for system control changes. The signals produced by the accelerometer are processed to produce indications of probe motion (acceleration, velocity, direction of motion, probe orientation) by a microcontroller in the wireless probe or by the acquisition and signal conditioning FPGA described above. The same passage teaches the operation in the scanning and control modes. The scanning mode performs imaging and the control mode allows for the user interface to be controlled via the probe. The probe is wirelessly connected to the system, as the range of unimpeded motions available is broader with an un-cabled probe. The wireless connection is done via a radio link. A motion such as spinning the probe 180 or 360 degrees in the palm of the hand, as indicated in Fig. 6, switches the system to the control mode. Once the system has been switched to the control mode, the probe is used as a remote user interface controller, wherein subsequent motions of the probe select and activate configuration parameters for scanning, such as the imaging mode (2D, Flow, CW, PW, etc.), the image Gain or Depth, or other system controls such as image review scrolling and capture and transfer of previously acquired still or cine loop images to storage. Col. 11, lines 33-67 teaches that the connection with the probe is an automatic connection via a pairing protocol).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Poland’s teaching of the wireless connection between the probe and the display system and the operation such that the probe can control the interface and perform imaging. This modified apparatus would allow the user to have ultrasound system portability with a system configuration that eliminates the need to transport a diagnostically useful display screen (Col. 1, lines 13-39 of Poland). Furthermore, the modification is an improvement as it provides a broader range of unimpeded motion with an un-cabled probe (Col. 7, line 49-Col. 8, line 32 of Poland).
Regarding claim 19, modified Pelissier teaches the ultrasound scanner in claim 15, as discussed above.
However, Pelissier is silent regarding an ultrasound scanner, comprising a multi-axis motion sensor, wherein the computer readable instructions, when executed by the processor, cause the ultrasound scanner to: detect that the ultrasound scanner is in scanning mode by detecting an ultrasound scanning motion of the ultrasound scanner.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Poland teaches an ultrasound scanner, comprising a multi-axis motion sensor, wherein the computer readable instructions, when executed by the processor, cause the ultrasound scanner to: detect that the ultrasound scanner is in scanning mode by detecting an ultrasound scanning motion of the ultrasound scanner (Col. 7, line 49-Col. 8, line 32 teaches that the probe contains a 3-axis or 6-axis motion sensor such as an accelerometer. The accelerometer is used to sense probe motions for system control changes. The signals produced by the accelerometer are processed to produce indications of probe motion (acceleration, velocity, direction of motion, probe orientation) by a microcontroller in the wireless probe or by the acquisition and signal conditioning FPGA described above. The same passage teaches the operation in the scanning and control modes. The scanning mode performs imaging and the control mode allows for the user interface to be controlled via the probe. The probe is wirelessly connected to the system, as the range of unimpeded motions available is broader with an un-cabled probe. The wireless connection is done via a radio link. A motion such as spinning the probe 180 or 360 degrees in the palm of the hand, as indicated in Fig. 6, switches the system to the control mode. Once the system has been switched to the control mode, the probe is used as a remote user interface controller, wherein subsequent motions of the probe select and activate configuration parameters for scanning, such as the imaging mode (2D, Flow, CW, PW, etc.), the image Gain or Depth, or other system controls such as image review scrolling and capture and transfer of previously acquired still or cine loop images to storage. Col. 11, lines 33-67 teaches that the connection with the probe is an automatic connection via a pairing protocol. Fig. 6 shows motions similar to scanning motions).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Poland’s teaching of detection of a scanning mode based on a scanning motion. This modified apparatus would allow the user to have ultrasound system portability with a system configuration that eliminates the need to transport a diagnostically useful display screen (Col. 1, lines 13-39 of Poland). Furthermore, the modification is an improvement as it provides a broader range of unimpeded motion with an un-cabled probe (Col. 7, line 49-Col. 8, line 32 of Poland).
Regarding claim 21, modified Pelissier teaches the method in claim 1, as discussed above.
However, Pelissier is silent regarding a method, wherein, in navigation mode, the ultrasound scanner sends to the interface at least one of the following: navigation inputs, option selections and alphanumeric data input.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Poland teaches a method, wherein, in navigation mode, the ultrasound scanner sends to the interface at least one of the following: navigation inputs, option selections and alphanumeric data input (Col. 11, lines 14-32 teaches user interface commands issued by the user by means of the system user interface such as probe gestures. Col. 7, lines 49-Col. 8, lines 32 teaches the operation in the scanning and control mode. Once the system has been switched to the control mode, the probe is used as a remote user interface controller, wherein subsequent motions of the probe select and activate configuration parameters for scanning, such as the imaging mode (2D, Flow, CW, PW, etc.), the image Gain or Depth, or other system controls such as image review scrolling and capture and transfer of previously acquired still or cine loop images to storage. See Fig. 6).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Poland’s teaching of inputting in a navigation mode. This modified method would allow the user to have ultrasound system portability with a system configuration that eliminates the need to transport a diagnostically useful display screen (Col. 1, lines 13-39 of Poland). Furthermore, the modification is an improvement as it provides a broader range of unimpeded motion with an un-cabled probe (Col. 7, line 49-Col. 8, line 32 of Poland).
Regarding claim 22, modified Pelissier teaches the method in claim 1, as discussed above.
Pelissier further teaches a method, wherein, in scanning mode, the parameter is data selected from the group consisting of imaging parameters, beamforming parameters and configuration settings (Paragraphs 0130-0131 teach that the display device determines the initial set of imaging parameters to be used).
Regarding claim 23, modified Pelissier teaches the ultrasound scanner in claim 15, as discussed above.
However, Pelissier is silent regarding a method, wherein in navigation mode, the ultrasound scanner sends to the interface at least one of the following: navigation inputs, option selections and alphanumeric data input.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Poland teaches a method, wherein in navigation mode, the ultrasound scanner sends to the interface at least one of the following: navigation inputs, option selections and alphanumeric data input (Col. 11, lines 14-32 teaches user interface commands issued by the user by means of the system user interface such as probe gestures. Col. 7, lines 49-Col. 8, lines 32 teaches the operation in the scanning and control mode. Once the system has been switched to the control mode, the probe is used as a remote user interface controller, wherein subsequent motions of the probe select and activate configuration parameters for scanning, such as the imaging mode (2D, Flow, CW, PW, etc.), the image Gain or Depth, or other system controls such as image review scrolling and capture and transfer of previously acquired still or cine loop images to storage. See Fig. 6).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Pelissier with Poland’s teaching of inputting in a navigation mode. This modified apparatus would allow the user to have ultrasound system portability with a system configuration that eliminates the need to transport a diagnostically useful display screen (Col. 1, lines 13-39 of Poland). Furthermore, the modification is an improvement as it provides a broader range of unimpeded motion with an un-cabled probe (Col. 7, line 49-Col. 8, line 32 of Poland).
Claims 5 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Pelissier et al. (PGPUB No. US 2016/0278739) in view of Poland (US Patent No. 11,553,895) further in view of Torp et al. (PGPUB No. US 2014/0187950).
Regarding claim 5, modified Pelissier teaches the method in claim 1, as discussed above.
However, the combination of Pelissier and Poland is silent regarding a method, wherein the ultrasound scanner enters scanning mode by the ultrasound scanner detecting an activation of a button on the ultrasound scanner.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Torp teaches a method, wherein the ultrasound scanner enters scanning mode by the ultrasound scanner detecting an activation of a button on the ultrasound scanner (Paragraph 0041 teaches that the mode can be toggled based on the control of the switch 155. Paragraph 0002 teaches that the button instructs when to perform the scan. See Fig. 6).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the combination of Pelissier and Poland with Torp’s teaching of detecting imaging mode based on a button activation. This modified method would allow a user to accurately image in a manner that is less burdensome and less prone to producing corrupted datasets (Paragraph 0002 of Torp). Furthermore, the utilization of buttons allows for easy control inputs.
Regarding claim 17, modified Pelissier teaches the ultrasound scanner in claim 15, as discussed above.
However, the combination of Pelissier and Poland is silent regarding an ultrasound scanner, which enters scanning mode by detecting activation of a button on the ultrasound scanner.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Torp teaches an ultrasound scanner, which enters scanning mode by detecting activation of a button on the ultrasound scanner (Paragraph 0041 teaches that the mode can be toggled based on the control of the switch 155. Paragraph 0002 teaches that the button instructs when to perform the scan. See Fig. 6).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the combination of Pelissier and Poland with Torp’s teaching of detecting an imaging mode based on a button activation. This modified apparatus would allow the user to accurately image in a manner that is less burdensome and less prone to producing corrupted datasets (Paragraph 0002 of Torp). Position and orientation are determined with enhanced accuracy and precision (Paragraph 0038 of Torp).
Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Pelissier et al. (PGPUB No. US 2016/0278739) in view of Poland (US Patent No. 11,553,895) further in view of Poland (PGPUB No. US 2015/0245816; hereinafter referred to as “Poland ‘816”).
Regarding claim 12, modified Pelissier teaches the method in claim 11, as discussed above.
However, the combination of Pelissier and Poland is silent regarding a method, wherein the screen is in a standby or hibernation state and listens for the identification signal broadcast from the ultrasound scanner, the method comprising the interface switching on the screen before sending the ultrasound media to the screen.
In an analogous imaging field of endeavor, regarding the ultrasound communication connection to external systems, Poland ‘816 teaches a method, wherein the screen is in a standby or hibernation state and listens for the identification signal broadcast from the ultrasound scanner, the method comprising the interface switching on the screen before sending the ultrasound media to the screen (Paragraph 0043 teaches that the docking unit 16 is automatically booted from a hibernated state upon connection with the mobile device 18. This allows for the use of the device 18 hardware, including the display 26, used to display the ultrasound imaging information from the probe. Paragraph 0015 teaches the hibernation boot-up allows for a rapid reboot. Paragraph 0026 teaches the mobile display device becomes dedicated to the ultrasound scanner function when docked).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the combination of Pelissier and Poland with Poland ‘816’s teaching of a screen that wakes up upon information reception from an ultrasound system. This modified method would provide the user with improved ultrasound techniques in terms of cost, portability, and multipurpose functionality (Paragraph 0008 of Poland ‘816).
Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over Pelissier et al. (PGPUB No. US 2016/0278739) in view of Poland (US Patent No. 11,553,895) further in view of Nefos (PGPUB No. US 2005/0228281).
Regarding claim 20, modified Pelissier teaches the ultrasound scanner in claim 15, as discussed above.
However, the combination of Pelissier and Poland is silent regarding an ultrasound scanner, comprising a button, which, when activated, causes a command to be transmitted by the ultrasound scanner to the interface, when the ultrasound scanner and interface are in scanning mode, that:
adjusts a depth of the ultrasound media displaying on the screen;
adjusts a gain of the ultrasound media displaying on the screen; or
freezes an image of the ultrasound media displaying on the screen.
In an analogous imaging field of endeavor, regarding the transmission and control of ultrasound systems, Nefos teaches an ultrasound scanner, comprising a button (Keypad 20), which, when activated, causes a command to be transmitted by the ultrasound scanner to the interface, when the ultrasound scanner and interface are in scanning mode, that:
adjusts a depth of the ultrasound media displayed on the screen (Paragraph 0125 teaches that the key 34 is used to determine the depth for the Doppler mode to be set on);
adjusts a gain of the ultrasound media displayed on the screen (Paragraph 0122 teaches that the required gain can be selected with the keys 33 and 35); or
freezes an image of the ultrasound media displayed on the screen (Paragraph 0048 teaches that the probe contains a freeze button so any displayed image can be held).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the combination of Pelissier and Poland with Nefos’s teaching of adjusting gain and depth and freezing an image via button input. This modified apparatus would provide the user with the ability to study an image with closer inspection or measurement (Paragraph 0042 of Nefos).
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ADIL PARTAP S VIRK whose telephone number is (571)272-8569. The examiner can normally be reached Mon-Fri 8-5.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Pascal Bui-Pho can be reached on 571-272-2714. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ADIL PARTAP S VIRK/Primary Examiner, Art Unit 3798
1 “Bluetooth, technology standard used to enable short-range wireless communication between electronic devices. Bluetooth was developed in the late 1990s and soon achieved massive popularity in consumer devices.” (Link: https://www.britannica.com/technology/Bluetooth)
“Wi-Fi can also be used to provide wireless broadband Internet access for many modern devices, such as laptops, smartphones, tablet computers, and electronic gaming consoles. Wi-Fi-enabled devices are able to connect to the Internet when they are near areas that have Wi-Fi access, called ‘hotspots.’” (Link: https://www.britannica.com/technology/Wi-Fi)
2 “If two Bluetooth devices know absolutely nothing about each other, one must run an inquiry to try to discover the other. One device sends out the inquiry request, and any device listening for such a request will respond with its address, and possibly its name and other information.” (Link: https://learn.sparkfun.com/tutorials/bluetooth-basics/how-bluetooth-works)