DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
This action is in response to the remarks filed on 06/26/2025.
The amendments filed on 06/26/2025 have been entered. Accordingly, claims 1-6, 8-11, 13-24, and 26 remain pending. Claims 13-20 were previously withdrawn from consideration. Independent claims 1, 13, and 21 are presently amended.
Examiner notes that an interview request was received by the Office on 06/09/2025 while the examiner was on extended leave. Upon returning to the office, the examiner called attorney of record R. Scott McClelland (Reg. No. 68,257) on 08/11/2025. Applicant’s counsel requested an interview upon completion of examination if there were matters to be discussed to advance prosecution.
Response to Arguments
Applicant’s arguments with respect to amended independent claim 1 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. In particular, Buras is no longer relied on to teach the feedback limitation or the determination of whether the inertial status is consistent with the ultrasound exam function.
Claim Objections
Claims 1 and 21 are objected to because of the following informalities:
Regarding claim 1, the limitation “the determination” in line 10 may be considered to lack antecedent basis. Examiner has interpreted the determination to be that of whether the inertial status is consistent with the ultrasound exam function recited in line 8. Therefore, it is suggested to amend the limitation in line 8 to, e.g., --make a determination of [[determine]] whether the inertial status is consistent with the ultrasound exam function-- in order to clearly provide proper antecedent basis for the limitation, “the determination”, in line 10. This also applies to the analogous limitations of claim 21.
Appropriate correction is required.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-6, 8-9, 21-24, and 26 are rejected under 35 U.S.C. 103 as being unpatentable over Kremsl et al. (US 2023/0181159 A1, filed 12/10/2021, hereinafter "Kremsl") in view of Shoudy et al. (US 2020/0214667, published July 9, 2020, hereinafter “Shoudy”).
Regarding claim 1, Kremsl discloses an ultrasound imaging system with tactile probe control (Kremsl: Title), and further discloses:
An ultrasound imaging device ("ultrasound imaging system" Kremsl: [0001]) including a sensor circuitry ("an integrated circuit" Kremsl: [0012]) and a housing ("ultrasound imaging probe includes a housing" Kremsl: [0012]), the sensor circuitry disposed in the housing and coupled thereto ("an integrated circuit disposed within the housing" Kremsl: [0012]) to:
sense a pattern of haptic inputs at a surface area of the housing ("at least one motion sensor operably connected to the microcontroller and capable of sensing motion of the housing based on at least one operator interaction with the housing to generate sensor data representing the at least one operator interaction" Kremsl: [0012]; also see “Each command for a particular operational configuration of the imaging system and/or probe 106 is associated with a predetermined type, number and/or pattern of interactions performed by the operator on the housing 300.” [0040]; also see [0053]);
send to a computing system ("a microcontroller" Kremsl: [0012]) information based on the pattern of haptic inputs to cause an ultrasound exam function to be executed at the computing system ("microcontroller is configured to receive sensor data from the at least one motion sensor in response to at least one operator interaction with the housing, to compare the received sensor data with the stored sensor data, and to change an operational configuration of the ultrasound imaging probe in response to matching the received sensor data with stored sensor data" Kremsl: [0012]), the ultrasound exam function to control an ultrasound image (“when an operator taps 2 times on the probe 106/housing 300/handle 306, the imaging system 100/probe 106 increases image depth” [0055]) on a display of the computing system ("a display operably connected to the processing unit to present the created ultrasound images to a user" Kremsl: [0013]);
sense an inertial status of the ultrasound imaging device (“the one or more sensor boards 308 are each formed as an integrated circuit/printed circuit board assembly (PCBA) 312. The PCBA 312 includes a main board 314 including the sensors 310 mounted thereto. The sensors 310 disposed on the main board 314 are formed as any suitable type of motion sensor 310, including but not limited to, a drop sensor, an accelerometer, a pressure sensor, a capacitance sensor, and/or a gyroscopic sensor, or combinations thereof, among others. The PCBA 312 can also include sensors 310 other than motion sensors, such as a temperature sensor, a light sensor 315, and/or a humidity sensor, among others. The motion sensors 310 disposed on the main board 314 are capable of sensing actions performed on the housing 300 by the operator and using the PCBA 312 and/or processing unit 120 to interpret the actions into controls regarding the operation of the imaging system 100 and/or probe 106.” Kremsl: [0032]).
Although Kremsl discloses sensing the inertial status of the ultrasound imaging device as stated above, Kremsl is not relied on for teaching:
determine whether the inertial status is consistent with the ultrasound exam function; and
cause feedback to be communicated to a user of the ultrasound imaging device based on the determination, the feedback corresponding to an adjustment of an ultrasound examination by the user.
However, in a similar invention in the same field of endeavor, Shoudy teaches sensing an inertial status of the ultrasound imaging device (“the ultrasound probe may each have one or more position and/or orientation sensors, such as inertial measurement units” Shoudy: [0023]; also see [0037]);
determine whether the inertial status is consistent with the ultrasound exam function (“The processor may determine a current position of the ultrasound probe relative to a region of interest of a patient. The region of interest may include a desired anatomy, a desired scan plane, or both, to be imaged via the ultrasound probe. The processor may also determine whether the current position of the ultrasound probe corresponds to a desired position of the ultrasound probe relative to the region of interest. The desired position of the ultrasound probe facilitates an acquisition of an ultrasound image of the region of interest.” Shoudy: [0004]; examiner notes that the desired anatomical feature and/or desired scan plane of Shoudy is interpreted as the claimed ultrasound exam function; examiner further notes that the desired anatomical feature and/or desired scan plane is taught to be a haptic input by Shoudy in, e.g., [0077]; finally, examiner notes that although the ultrasound exam function and the sensing of haptic inputs to cause an ultrasound exam function are also taught by Shoudy, these limitations are already disclosed by primary reference Kremsl as shown above); and
cause feedback to be communicated to a user of the ultrasound imaging device based on the determination, the feedback corresponding to an adjustment of an ultrasound examination by the user (“In response to determining that the current position of the ultrasound probe does not correspond to the desired position of the ultrasound probe, the processor may transmit a control signal to a plurality of haptic actuators. The control signal may cause the plurality of actuators to pulse in a actuation pattern indicative of a suggested movement of the ultrasound probe to position the ultrasound probe in the desired position.” Shoudy: [0002]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the ultrasound imaging system disclosed by Kremsl, by determining whether the inertial status is consistent with the ultrasound exam function; and causing feedback to be communicated to a user of the ultrasound imaging device based on the determination, the feedback corresponding to an adjustment of an ultrasound examination by the user as taught by Shoudy. One of ordinary skill in the art would have been motivated to make this modification because “such techniques may allow novice, or less skilled, users to obtain accurate ultrasound scans of desired anatomical features of a patient” (Shoudy: [0022]). Thus, this modification improves usability for users of varying proficiency in achieving a desired outcome.
Regarding claim 2, the combination of Kremsl and Shoudy discloses:
The ultrasound imaging device of claim 1, as described above.
Kremsl further discloses:
wherein the sensor circuitry is further to send information based on the haptic input to a sensor signal processing circuitry ("memory chip disposed within the housing and operably connected to the microcontroller, the memory chip storing information regarding particular operational configurations for the ultrasound imaging probe associated with stored sensor data representing types of stored operator interactions" Kremsl: [0012]),
the sensor signal processing circuitry to determine a correlation between the sensed pattern of haptic input and one or more ultrasound exam functions to be executed at the computing system ("microcontroller is configured to receive sensor data from the at least one motion sensor in response to at least one operator interaction with the housing, to compare the received sensor data with the stored sensor data, and to change an operational configuration of the ultrasound imaging probe in response to matching the received sensor data with stored sensor data" Kremsl: [0012]).
Regarding claim 3, the combination of Kremsl and Shoudy discloses:
The ultrasound imaging device of claim 1, as described above.
Kremsl further discloses:
wherein the sensor circuitry includes an accelerometer ("sensors 310 disposed on the main board 314 are formed as any suitable type of motion sensor 310, including but not limited to, a drop sensor, an accelerometer, a pressure sensor, a capacitance sensor, and/or a gyroscopic sensor, or combinations thereof, among others" Kremsl: [0032]).
Regarding claim 4, the combination of Kremsl and Shoudy discloses:
The ultrasound imaging device of claim 3, as described above.
Kremsl further discloses:
wherein the sensor circuitry further includes a gyroscope ("sensors 310 disposed on the main board 314 are formed as any suitable type of motion sensor 310, including but not limited to, a drop sensor, an accelerometer, a pressure sensor, a capacitance sensor, and/or a gyroscopic sensor, or combinations thereof, among others" Kremsl: [0032]).
Regarding claim 5, the combination of Kremsl and Shoudy discloses:
The ultrasound imaging device of claim 4, as described above.
Kremsl further discloses:
wherein the sensor circuitry includes a sensor device and a sensor processing circuitry coupled to the sensor device ("one or more sensor boards 308 are each formed as an integrated circuit/printed circuit board assembly (PCBA) 312" Kremsl: [0032]), the sensor device including the accelerometer and the gyroscope ("sensors 310 disposed on the main board 314 are formed as any suitable type of motion sensor 310, including but not limited to, a drop sensor, an accelerometer, a pressure sensor, a capacitance sensor, and/or a gyroscopic sensor, or combinations thereof, among others" Kremsl: [0032]), and the sensor processing circuitry to fuse signals corresponding to raw accelerometer data from the accelerometer with signals corresponding to raw gyroscope data from the gyroscope by processing the raw accelerometer data and the raw gyroscope data to generate signals therefrom ("motion sensors 310, and in particular the accelerometer 336 and/or gyro sensor 338 sense/record operator commands via the interaction of the operator directly with the housing 300 and handle 306 for the probe 106 to control the operation of the ultrasound imaging system 100 based on these commands" Kremsl: [0039]) corresponding to the inertial status of the ultrasound imaging device ("operator interactions that can be sensed by the motion sensors 310, e.g., the accelerometer 336 and/or the gyro sensor 338, and translated into sensor data/operator commands can be performed by:
1) tapping/making a finger tap gesture or action with one or more fingers on or against the probe 106/housing 300/handle 306;
2) shaking the probe 106/housing 300/handle 306;
3) waving the probe 106/housing 300/handle 306 in a particular direction or directions;
4) rotating the probe 106/housing 300/handle 306 in a particular direction; and/or
5) inverting the probe 106/housing 300/handle 306, such as for a specified time period" Kremsl: [0042]-[0048]).
Regarding claim 6, the combination of Kremsl and Shoudy discloses:
The ultrasound imaging device of claim 1, as described above.
Kremsl further discloses:
wherein the inertial status of the ultrasound imaging device includes information based on at least one of whether the ultrasound imaging device is stationary, an angular tilt of the ultrasound imaging device with respect to a skin surface of a body being examined, an angular velocity of the ultrasound imaging device with respect to the skin, a position of the ultrasound imaging device on the skin, or a linear velocity of the ultrasound imaging device ("operator interactions that can be sensed by the motion sensors 310, e.g., the accelerometer 336 and/or the gyro sensor 338, and translated into sensor data/operator commands can be performed by:
1) tapping/making a finger tap gesture or action with one or more fingers on or against the probe 106/housing 300/handle 306;
2) shaking the probe 106/housing 300/handle 306;
3) waving the probe 106/housing 300/handle 306 in a particular direction or directions;
4) rotating the probe 106/housing 300/handle 306 in a particular direction; and/or
5) inverting the probe 106/housing 300/handle 306, such as for a specified time period" Kremsl: [0042]-[0048]).
Regarding claim 8, the combination of Kremsl and Shoudy discloses:
The ultrasound imaging device of claim 1, as described above.
Kremsl further discloses:
wherein the haptic input includes one or more taps on a surface of the housing ("operator interactions that can be sensed by the motion sensors 310, e.g., the accelerometer 336 and/or the gyro sensor 338, and translated into sensor data/operator commands can be performed by:
1) tapping/making a finger tap gesture or action with one or more fingers on or against the probe 106/housing 300/handle 306" Kremsl: [0042]-[0043]).
Regarding claim 9, the combination of Kremsl and Shoudy discloses:
The ultrasound imaging device of claim 1, as described above.
Kremsl further discloses:
wherein the haptic input includes aerial motion of the ultrasound imaging device ("operator interactions that can be sensed by the motion sensors 310, e.g., the accelerometer 336 and/or the gyro sensor 338, and translated into sensor data/operator commands can be performed by:
...
2) shaking the probe 106/housing 300/handle 306;
3) waving the probe 106/housing 300/handle 306 in a particular direction or directions;
4) rotating the probe 106/housing 300/handle 306 in a particular direction; and/or
5) inverting the probe 106/housing 300/handle 306, such as for a specified time period" Kremsl: [0042]-[0048]).
Regarding claim 26, the combination of Kremsl and Shoudy discloses:
The ultrasound imaging device of claim 1, as described above.
Kremsl further discloses:
wherein the sensor circuitry is to sense a tap sequence at the surface area of the housing (“besides the operator interaction being defined as a single action taken relative to the probe 106/housing 300/handle 306, a single tap, pre-defined tap gesture or action sequences may be utilized. For example, the operator interaction associated with an operator command can be a simple or complex sequence of interactions with the probe 106/housing 300/handle 306. From the simple to the complex, these sequences of interactions can be a number of finger taps on the probe 106/housing 300/handle 306, a sequence of finger taps with pauses between some or all of the finger taps, similar to a Morse code sequence, and a combination of finger tap gestures or actions and other motions of the probe 106/housing 300/handle 306, such as waving, shaking, and/or rotating the probe 106/housing 300/handle 306.” Kremsl: [0053]) and send information about the tap sequence to the computing system to cause an ultrasound exam function to be executed at the computing system ("microcontroller is configured to receive sensor data from the at least one motion sensor in response to at least one operator interaction with the housing, to compare the received sensor data with the stored sensor data, and to change an operational configuration of the ultrasound imaging probe in response to matching the received sensor data with stored sensor data" Kremsl: [0012]).
Regarding claim 21, Kremsl discloses an ultrasound imaging system with tactile probe control (Kremsl: Title), and further discloses:
one or more tangible non-transitory computer-readable storage media comprising a plurality of instructions stored thereon that, when executed, cause one or more processors of an ultrasound imaging device to perform operations (“processing unit” Kremsl: [0013]) including:
obtaining data from sensor circuitry of the ultrasound imaging device (“an integrated circuit disposed within the housing, the integrated circuit comprising a microcontroller, and at least one motion sensor operably connected to the microcontroller and capable of sensing motion of the housing based on at least one operator interaction with the housing to generate sensor data representing the at least one operator interaction” Kremsl: [0013]);
detecting a pattern of haptic input at a surface area of a housing of the ultrasound imaging device based on the data from the sensor circuitry ("an ultrasound imaging probe includes a housing [...] at least one motion sensor operably connected to the microcontroller and capable of sensing motion of the housing based on at least one operator interaction with the housing to generate sensor data representing the at least one operator interaction" Kremsl: [0012]; also see “Each command for a particular operational configuration of the imaging system and/or probe 106 is associated with a predetermined type, number and/or pattern of interactions performed by the operator on the housing 300.” Kremsl: [0040]; also see Kremsl: [0053]);
sending to a computing system ("a microcontroller" Kremsl: [0012]) information based on the pattern of haptic input to cause an ultrasound exam function to be executed at the computing system ("microcontroller is configured to receive sensor data from the at least one motion sensor in response to at least one operator interaction with the housing, to compare the received sensor data with the stored sensor data, and to change an operational configuration of the ultrasound imaging probe in response to matching the received sensor data with stored sensor data" Kremsl: [0012]), the ultrasound exam function to control an ultrasound image (“when an operator taps 2 times on the probe 106/housing 300/handle 306, the imaging system 100/probe 106 increases image depth” [0055]) on a display of the computing system ("a display operably connected to the processing unit to present the created ultrasound images to a user" Kremsl: [0013]);
determining an inertial status of the ultrasound imaging device based on the data from the sensor circuitry (“the one or more sensor boards 308 are each formed as an integrated circuit/printed circuit board assembly (PCBA) 312. The PCBA 312 includes a main board 314 including the sensors 310 mounted thereto. The sensors 310 disposed on the main board 314 are formed as any suitable type of motion sensor 310, including but not limited to, a drop sensor, an accelerometer, a pressure sensor, a capacitance sensor, and/or a gyroscopic sensor, or combinations thereof, among others. The PCBA 312 can also include sensors 310 other than motion sensors, such as a temperature sensor, a light sensor 315, and/or a humidity sensor, among others. The motion sensors 310 disposed on the main board 314 are capable of sensing actions performed on the housing 300 by the operator and using the PCBA 312 and/or processing unit 120 to interpret the actions into controls regarding the operation of the imaging system 100 and/or probe 106.” Kremsl: [0032]).
Although Kremsl discloses determining the inertial status of the ultrasound imaging device based on the data from the sensor circuitry as stated above, Kremsl is not relied on for teaching:
determining whether the inertial status is consistent with the ultrasound exam function; and
communicating feedback to a user of the ultrasound imaging device based on the determination, the feedback corresponding to an adjustment of an ultrasound examination by the user.
However, in a similar invention in the same field of endeavor, Shoudy teaches determining an inertial status of the ultrasound imaging device (“the ultrasound probe may each have one or more position and/or orientation sensors, such as inertial measurement units” Shoudy: [0023]; also see [0037]);
determining whether the inertial status is consistent with the ultrasound exam function (“The processor may determine a current position of the ultrasound probe relative to a region of interest of a patient. The region of interest may include a desired anatomy, a desired scan plane, or both, to be imaged via the ultrasound probe. The processor may also determine whether the current position of the ultrasound probe corresponds to a desired position of the ultrasound probe relative to the region of interest. The desired position of the ultrasound probe facilitates an acquisition of an ultrasound image of the region of interest.” Shoudy: [0004]; examiner notes that the desired anatomical feature and/or desired scan plane of Shoudy is interpreted as the claimed ultrasound exam function; examiner further notes that the desired anatomical feature and/or desired scan plane is taught to be a haptic input by Shoudy in, e.g., [0077]; finally, examiner notes that although the ultrasound exam function and the sensing of haptic inputs to cause an ultrasound exam function are also taught by Shoudy, these limitations are already disclosed by primary reference Kremsl as shown above); and
communicating feedback to a user of the ultrasound imaging device based on the determination, the feedback corresponding to an adjustment of an ultrasound examination by the user (“In response to determining that the current position of the ultrasound probe does not correspond to the desired position of the ultrasound probe, the processor may transmit a control signal to a plurality of haptic actuators. The control signal may cause the plurality of actuators to pulse in a actuation pattern indicative of a suggested movement of the ultrasound probe to position the ultrasound probe in the desired position.” Shoudy: [0002]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the ultrasound imaging system disclosed by Kremsl, by determining whether the inertial status is consistent with the ultrasound exam function; and communicating feedback to a user of the ultrasound imaging device based on the determination, the feedback corresponding to an adjustment of an ultrasound examination by the user as taught by Shoudy. One of ordinary skill in the art would have been motivated to make this modification because “such techniques may allow novice, or less skilled, users to obtain accurate ultrasound scans of desired anatomical features of a patient” (Shoudy: [0022]). Thus, this modification improves usability for users of varying proficiency in achieving a desired outcome.
Regarding claim 22, the combination of Kremsl and Shoudy discloses:
The storage media of claim 21, as described above.
Kremsl further discloses:
further including sending information based on the haptic input to a sensor signal processing circuitry ("memory chip disposed within the housing and operably connected to the microcontroller, the memory chip storing information regarding particular operational configurations for the ultrasound imaging probe associated with stored sensor data representing types of stored operator interactions" Kremsl: [0012]),
the sensor signal processing circuitry to determine a correlation between the sensed pattern of haptic input and one or more ultrasound exam functions to be executed at the computing system ("microcontroller is configured to receive sensor data from the at least one motion sensor in response to at least one operator interaction with the housing, to compare the received sensor data with the stored sensor data, and to change an operational configuration of the ultrasound imaging probe in response to matching the received sensor data with stored sensor data" Kremsl: [0012]).
Regarding claim 23, the combination of Kremsl and Shoudy discloses:
The storage media of claim 21, as described above.
Kremsl further discloses:
further including fusing signals corresponding to raw accelerometer data from the accelerometer with signals corresponding to raw gyroscope data from the gyroscope by processing the raw accelerometer data and the raw gyroscope data to generate signals therefrom ("motion sensors 310, and in particular the accelerometer 336 and/or gyro sensor 338 sense/record operator commands via the interaction of the operator directly with the housing 300 and handle 306 for the probe 106 to control the operation of the ultrasound imaging system 100 based on these commands" Kremsl: [0039]) corresponding to the inertial status of the ultrasound imaging device ("operator interactions that can be sensed by the motion sensors 310, e.g., the accelerometer 336 and/or the gyro sensor 338, and translated into sensor data/operator commands can be performed by:
1) tapping/making a finger tap gesture or action with one or more fingers on or against the probe 106/housing 300/handle 306;
2) shaking the probe 106/housing 300/handle 306;
3) waving the probe 106/housing 300/handle 306 in a particular direction or directions;
4) rotating the probe 106/housing 300/handle 306 in a particular direction; and/or
5) inverting the probe 106/housing 300/handle 306, such as for a specified time period" Kremsl: [0042]-[0048]).
Regarding claim 24, the combination of Kremsl and Shoudy discloses:
The storage media of claim 23, as described above.
Kremsl further discloses:
wherein the inertial status of the ultrasound imaging device includes information based on at least one of whether the ultrasound imaging device is stationary, an angular tilt of the ultrasound imaging device with respect to a skin surface of a body being examined, an angular velocity of the ultrasound imaging device with respect to the skin, a position of the ultrasound imaging device on the skin, or a linear velocity of the ultrasound imaging device ("operator interactions that can be sensed by the motion sensors 310, e.g., the accelerometer 336 and/or the gyro sensor 338, and translated into sensor data/operator commands can be performed by:
1) tapping/making a finger tap gesture or action with one or more fingers on or against the probe 106/housing 300/handle 306;
2) shaking the probe 106/housing 300/handle 306;
3) waving the probe 106/housing 300/handle 306 in a particular direction or directions;
4) rotating the probe 106/housing 300/handle 306 in a particular direction; and/or
5) inverting the probe 106/housing 300/handle 306, such as for a specified time period" Kremsl: [0042]-[0048]).
Claims 10-11 are rejected under 35 U.S.C. 103 as being unpatentable over Kremsl in view of Shoudy as applied to claim 1 above and further in view of Buras et al. (US 2021/0327304, hereinafter “Buras”).
Regarding claim 10, the combination of Kremsl and Shoudy discloses:
The ultrasound imaging device of claim 1, as described above.
Kremsl further discloses:
the sensor circuitry to sense a pattern of sensor inputs including the haptic input, each pattern of sensor input associated with a corresponding one of a plurality of ultrasound exam functions ("operator interaction can initiate a single event or a series of actions by the imaging system 100 or the probe 106. For example:
1) when an operator taps 2 times on the probe 106/housing 300/handle 306, the imaging system 100/probe 106 increases image depth;
2) when an operator taps 3 times on the probe 106/housing 300/handle 306, the imaging system 100/probe 106 decreases image depth" Kremsl: [0055]-[0057]).
Kremsl is not relied on for teaching:
the sensor circuitry to sense a pattern of sensor inputs including, in a predetermined order, two or more of: the haptic input, an eye tracking input or a voice command input, each pattern of sensor input associated with a corresponding one of a plurality of ultrasound exam functions.
However, in a similar invention in the same field of endeavor, Buras further teaches the sensor circuitry to sense a pattern of sensor inputs including, in a predetermined order, two or more of: the haptic input, an eye tracking input or a voice command input, each pattern of sensor input associated with a corresponding one of a plurality of ultrasound exam functions (see “voice command”, “track an eye movement”, and “hand movement” Buras: [0287]-[0290]; also see “an input selected from a first voice command, a first eye movement, a first-hand movement, or two or more thereof; wherein the modifying is based on a second voice command, a second eye movement, a second hand movement, or two or more thereof.” Buras: claim 6).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the ultrasound imaging system disclosed by Kremsl by including, in a predetermined order, two or more of: the haptic input, an eye tracking input, or a voice command input, as taught by Buras. One of ordinary skill in the art would have been motivated to make this modification in order to provide various means of user input for different controls, which would provide ease of use (Buras: [0289]-[0291]).
Regarding claim 11, the combination of Kremsl, Shoudy, and Buras discloses:
The ultrasound imaging device of claim 10, as described above.
Kremsl further discloses:
wherein the ultrasound exam functions include at least one of freezing and unfreezing the ultrasound image ("2) freeze/pause the current image obtained by the probe 106 and being presented on the display 126 if the probe 106 is currently actively scanning;
3) unfreeze/unpauses the probe 106 currently in freeze/pause mode" Kremsl: [0050]-[0051]), saving the ultrasound image, taking a snapshot of the ultrasound image, starting and stopping a recording of ultrasound video, adjusting a depth of the ultrasound image or adjusting a gain of the ultrasound image.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to AMINAH ASGHAR whose telephone number is (571)272-0527. The examiner can normally be reached M-W, F 9am-5pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Christopher Koharski, can be reached at (571) 272-7230. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/A.A./Examiner, Art Unit 3797
/CHRISTOPHER KOHARSKI/Supervisory Patent Examiner, Art Unit 3797