DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-15 are rejected under 35 U.S.C. 103 as being unpatentable over Dunbar et al. (US Pub. No. 2011/0196235) in view of Rothberg et al. (US Pub. No. 2017/0360401) and Wimer (US Pub. No. 2014/0249405), alone, or alternatively, further in view of Lachaine et al. (US Pub. No. 2009/0024030).
With regards to claim 1, Dunbar et al. disclose a method comprising:
determining, by a host device (8, 9), a predetermined path of an ultrasound device including a first tilt from which a target anatomical view (i.e. anatomical view corresponding to the target) is available and a second tilt from which the target anatomical view is not available (i.e. anatomical view corresponding to points other than the target, such as the landmark position) (paragraphs [0046]-[0047], referring to the processing unit (9) generating on a video display (8) primary and secondary demonstration video clips; paragraph [0050], referring to the primary demonstration video clip (27) showing the movement of the probe (3) from a landmark (i.e. associated with the second position/orientation/motion) to a target (i.e. associated with the target anatomical view); paragraphs [0016], [0020]-[0022], referring to the demonstration video clip allowing a user to see the position, orientation and motion of the probe to proceed from a landmark to a target feature, wherein such a position/orientation/motion at one instance would inherently correspond to a particular tilt of the probe with respect to at least one surface of the body at that instance; Figures 1 and 4; in particular, note in Figure 4 that the ultrasound probe in the demonstration clips (27, 28) is depicted with a tilt with respect to a surface of the body);
instructing, by the host device, an operator to move an ultrasound device along the predetermined path by displaying a display (8) on the host device (paragraphs [0046]-[0047], referring to the processing unit (9) generating on a video display (8) primary and secondary demonstration video clips; paragraph [0050], referring to the primary demonstration video clip (27) showing the movement of the probe (3) from a landmark to a target; paragraphs [0016], [0020]-[0022], referring to the demonstration video clip allowing a user to see the motion of the probe to proceed from a landmark to a target feature, wherein the demonstration video clip is pre-recorded and thus provides an operator with instructions for moving an ultrasound device along a predetermined path relative to an anatomical area; Figures 1 and 4);
receiving a first ultrasound image depicting the target anatomical view and a second ultrasound image not depicting the target anatomical view based on ultrasound data collected by the ultrasound device while moving along the predetermined path (paragraphs [0009], [0042], [0047], referring to obtaining a live ultrasound image (7) of an imaging slice defined by the position and orientation of the probe; paragraphs [0020]-[0021], referring to, based on the demonstration video clip, etc., the user seeing what motions help in a particular procedural step to proceed from a landmark to a target feature, thus facilitating proper manipulation of the probe, and therefore the live ultrasound images acquired during the motion/manipulation of the probe from a landmark to a target feature would provide the first ultrasound image (i.e. live image associated with the target feature) and the second ultrasound image (i.e. live image associated with the landmark); Figure 4); and
identifying that the first ultrasound image depicts the target anatomical view (paragraphs [0017]-[0018], [0021], referring to the user seeing what motion helps in a particular procedural step to proceed from a landmark to a target feature based on the synchronized presentation of the demonstration video clip and the ultrasound image video clip and referring to presenting an ultrasound image video clip simultaneously with the primary demonstration video clip and the live ultrasound image which “allows the user to more naturally grasp the relationship between the manipulation of the scanning probe and the ultrasound image. Hence, this embodiment…can help the user to reproduce the procedural steps and interpret the live image…the user can on one hand compare the live ultrasound image with the ultrasound image video clip…”, wherein the comparison of the live ultrasound image (i.e. first ultrasound image) with the ultrasound image video clip to proceed from the landmark to a target feature would require identifying that the live image depicts the target anatomical view depicted in the ultrasound image video clip corresponding to the target feature; Figure 4), wherein
the predetermined path is determined prior to collecting the first ultrasound image and the second ultrasound image (paragraphs [0016]-[0017], referring to the position/motion (i.e. path) of the probe being provided in pre-recorded video form and thus determined prior to collecting the live image (i.e. first and second ultrasound images)),
the display includes a predetermined image or a predetermined video depicting (paragraphs [0016]-[0017], [0049], referring to the demonstration video clip (27, 28) retrieved from memory, and thus a “predetermined” video; Figure 4):
the anatomical area (paragraphs [0049]-[0050], referring to the anatomic model of a human being shown in the primary and secondary demonstration video clips (27, 28); Figure 4);
the predetermined path adjacent to the anatomical area depicted within the predetermined image or the predetermined video, wherein an image of the predetermined path is superimposed on the predetermined image or the predetermined video (paragraph [0050], referring to the primary and secondary demonstration video clips showing the movement (i.e. predetermined path) of the probe; for example, the primary demonstration video clip (27) shows the movement of the probe (3) from a landmark (i.e. associated with the second position/orientation/motion) to a target (i.e. associated with the target anatomical view), and thus an image of the predetermined path is superimposed on the predetermined video of the anatomical area via the depiction of the movement of the probe (3) from a landmark to the target, wherein such movement would define the “predetermined path”; paragraph [0011], referring to a demonstration video clip being an animated sequence of images demonstrating the performance of a step in the examination procedure, which instructs the user as to how he should perform a certain step of the procedure; Figure 4); and
the ultrasound device at a particular tilt along the predetermined path (see Figure 4; note that at one instance the ultrasound device is depicted at a particular tilt/orientation along the predetermined path (i.e. the series of movements of the probe shown in the video clips (27, 28) corresponding to movement from the landmark to the target feature)).
However, Dunbar et al. do not specifically disclose that the predetermined path has a pivot of fewer than 180 degrees about an anatomical area, that the instructed motion by the host device specifically provides a “pivot” instruction, or that the movement along the predetermined path is specifically a “pivoting” motion.
Further, Dunbar et al. do not disclose that the anatomical area comprises a pelvis or a portion of the pelvis.
Rothberg et al. disclose using a guide path to generate a sequence of instructions to provide to the operator in order to guide the operator to move the ultrasound device from an initial position to a target position, thereby allowing operators with little or no experience operating ultrasound devices to capture medically relevant ultrasound images and/or interpret the contents of the obtained ultrasound images (Abstract; paragraphs [0189]-[0192]). The set of instructions provided to an operator to properly position the ultrasound device may include (1) tilt the ultrasound device inferomedially, (2) rotate the ultrasound device counterclockwise, (3) rotate the ultrasound device clockwise, (4) move the ultrasound device one intercostal space down, (5) move the ultrasound device one intercostal space up, and (6) slide the ultrasound device medially (paragraph [0269]; note that instructions such as “tilt the ultrasound device inferomedially” and to rotate/move/slide the ultrasound device would provide a path for the ultrasound device that has a pivot/tilt of fewer than 180 degrees about an anatomical area, wherein an instruction to “tilt” or “rotate” the ultrasound device would provide instructions for the operator to “pivot” an ultrasound device (i.e. note that tilting corresponds with pivoting around one axis and rotating corresponds to pivoting around another axis)).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify the predetermined path of Dunbar et al. to have a pivot of fewer than 180 degrees about an anatomical area, to have the instructed motion by the host device specifically provide a “pivot” instruction, and to have the movement along the predetermined path be specifically a “pivoting” motion, as taught by Rothberg et al., in order to allow operators with little or no experience operating ultrasound devices to capture medically relevant ultrasound images with fine movements/instructions and/or interpret the contents of the obtained ultrasound images (Abstract; paragraphs [0189]-[0192]).
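For illustration only, the kind of serially presented instruction sequence described above can be pictured as an ordered list of probe motions. The following minimal sketch is a hypothetical assumption for exposition (the names, enum, and data structure are not drawn from Rothberg et al.'s disclosure):

```python
# Illustrative sketch only; names and structure are hypothetical and are not
# taken from Rothberg et al. It models a guide path as an ordered sequence of
# probe-movement instructions presented to the operator one at a time.
from dataclasses import dataclass
from enum import Enum, auto

class Motion(Enum):
    TILT = auto()       # pivot about one axis of the probe
    ROTATE = auto()     # pivot about another axis of the probe
    TRANSLATE = auto()  # slide/move along the skin surface

@dataclass(frozen=True)
class Instruction:
    motion: Motion
    text: str

# Hypothetical guide path from an initial position toward the target view.
GUIDE_PATH = [
    Instruction(Motion.TILT, "tilt the ultrasound device inferomedially"),
    Instruction(Motion.ROTATE, "rotate the ultrasound device counterclockwise"),
    Instruction(Motion.TRANSLATE, "move the ultrasound device one intercostal space down"),
    Instruction(Motion.TRANSLATE, "slide the ultrasound device medially"),
]

for step, instruction in enumerate(GUIDE_PATH, start=1):
    print(f"({step}) {instruction.text}")
```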
However, the above combined references do not disclose that the anatomical area comprises a pelvis or a portion of the pelvis.
Wimer discloses a user interface to provide guidance for injection and other procedures under ultrasound imaging (Abstract; paragraphs [0009], [0011]). In a musculoskeletal application in which ultrasound imaging is utilized, a menu selection for the desired anatomical location is made and an image on a display screen shows proper placement of the transducer and what an ultrasound image should look like, all of which assist the practitioner with effecting an optimal treatment for the subject/patient (paragraph [0011]). Target locations and treatments may include joints, wherein the joints to be targeted may include the hip (paragraph [0016]).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have the anatomical area of the above combined references be a pelvis or a portion of the pelvis, as taught by Wimer, in order to provide optimal treatment of a hip joint (paragraphs [0011], [0016]).
With regards to the limitation that “an image of the predetermined path is superimposed on the predetermined image or the predetermined video”, if Dunbar is considered to not teach this limitation, alternatively, Lachaine et al. disclose a method of presenting a suggested path for an ultrasound probe along a patient’s surface, wherein a location-guidance image (125) is superimposed with a suggested path (100), thereby enabling the operator to include the appropriate anatomy in the scan with minimal searching and allowing the operator to reproduce the same scan from one session to the next (paragraphs [0012], [0015], [0021]; claim 7; Figures 2-3). The intended scanning path (100) can be shown as a line and can also denote a suggested direction of travel for the probe along the line (paragraph [0022]). In some embodiments, a movie loop may be compiled showing an actual representation of the suggested probe motion along the intended scan path, indicating a more detailed version of the motion (paragraph [0022]; Figures 2-3).
Therefore, alternatively, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have an image of the predetermined path be superimposed on the predetermined image or the predetermined video, as taught by Lachaine et al., in order to enable the operator to include the appropriate anatomy in the scan with minimal searching and allow the operator to reproduce the same scan from one session to the next (paragraph [0021]).
With regards to claim 2, Dunbar et al. disclose that identifying that the first ultrasound image depicts the target anatomical view occurs after the ultrasound device pivots along the predetermined path (paragraphs [0018], [0021], [0052], referring to the user, as they are manipulating the scanning probe, being able to compare the live ultrasound image and the ultrasound image video clip (i.e. which would include a pivot motion in the above combination of references) in order to reproduce the procedural steps for proceeding from a landmark to a target feature; therefore, it is inherent that the user would identify the first ultrasound image (i.e. live image associated with the target feature) as depicting the target anatomical view after the user manipulates the probe to follow the predetermined path (i.e. motion of the probe in the demonstration video clip, which includes a pivot instruction as set forth in the above combination of references), since the user would want to pause the live video frame after the probe has been manipulated to follow the predetermined path that leads to acquiring the target feature).
With regards to claim 3, Dunbar et al. disclose that identifying that the first ultrasound image depicts the target anatomical view occurs while the ultrasound device pivots along the predetermined path (paragraphs [0018], [0021], [0052], referring to the user, as they are manipulating the scanning probe, being able to compare the live ultrasound image and the ultrasound image video clip (i.e. which would include a pivot motion in the above combination of references) in order to reproduce the procedural steps for proceeding from a landmark to a target feature; therefore, it is inherent that the user would identify the first ultrasound image (i.e. live image associated with the target feature) as depicting the target anatomical view while the user manipulates/handles the probe to follow the predetermined path (i.e. motion of the probe in the demonstration video clip, which includes a pivot instruction as set forth in the above combination of references), since the ultrasound probe is acquiring the live images while it is being moved by the user).
With regards to claim 4, as discussed above, the above combined references meet the limitations of claim 1. However, they do not specifically disclose that their method further comprises saving the first ultrasound image to a memory, based on identifying that the first ultrasound image depicts the target anatomical view. Rothberg et al. disclose that, based on identifying that the first ultrasound image (i.e. captured ultrasound images) depicts the target anatomical view, the first ultrasound image is saved to memory (i.e. 1522) in order to add to the medical record of the subject (paragraph [0312]; Figure 15B). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have the method of the above combined references further comprise saving the first ultrasound image to a memory, based on identifying that the first ultrasound image depicts the target anatomical view, as taught by Rothberg et al., in order to add to the medical record of the subject (paragraph [0312]).
With regards to claim 5, as discussed above, the above combined references meet the limitations of claim 1. Further, Dunbar et al. disclose that determining the predetermined path includes accessing a database including a plurality of paths (i.e. probe motion paths as depicted in demonstration video clips associated with the different procedures) and retrieving the predetermined path based on a selection of a procedure (paragraphs [0047]-[0049], referring to the procedure selection screen (15), which includes a plurality of buttons (21) representing different procedures, which are stored in memory (10) and thus considered to be in a “database” in memory, wherein each selected procedure is associated with a demonstration video clip (27, 28); Figures 2-4). However, Dunbar et al. do not specifically disclose that the selection of a procedure corresponds to a selection of a target anatomical view. Wimer discloses that an operator selects from a menu a treatment target/anatomy of interest and the system then displays the guidance for providing correct transducer placement for achieving an image associated with the treatment target/anatomy of interest (paragraphs [0011], [0013], [0037]; Figure 4). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have the selection of the procedure of Dunbar et al. correspond to a selection of a target anatomical view, as taught by Wimer, in order to assist the operator with effecting an optimal treatment for the subject/patient for different treatment target anatomies (paragraphs [0011], [0013]).
With regards to claim 6, Dunbar et al. disclose that the database further includes the predetermined image or the predetermined video (paragraphs [0016], [0049], referring to the demonstration video clip being pre-recorded and stored in memory and retrieved from the memory (10)).
With regards to claim 7, Dunbar et al. disclose that instructing the operator includes instructions for the operator to maintain contact between at least a portion of a sensor of the ultrasound device and a pivot point on the predetermined path (paragraph [0016], referring to the demonstration video clips showing the position, orientation and motion of a probe of the ultrasound scanner relative to the patient’s anatomy, wherein, as shown in Figure 4, the demonstration videos (27, 28) depict that the probe is in contact with a point on the predetermined path, which would correspond to a pivot point in the above combined references). Alternatively, Rothberg et al. disclose that instructions are provided for the operator to maintain contact between at least a portion of a sensor of the ultrasound device and a pivot point on the predetermined path (paragraph [0269]; also see Figure 8D, wherein the displayed instructions to rotate would also provide instructions that contact should be maintained at the pivot/rotation point). Therefore, if Dunbar et al. are viewed as not teaching instructions for the operator to maintain contact between at least a portion of a sensor of the ultrasound device and a pivot point on the predetermined path, alternatively, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to further modify the method of the above combined references to provide instructions for the operator to maintain contact between at least a portion of a sensor of the ultrasound device and a pivot point on the predetermined path, as taught by Rothberg et al., in order to successfully capture an ultrasound image of the subject that contains a particular anatomical view (Abstract; paragraph [0269]).
With regards to claim 8, as discussed above, the above combined references meet the limitations of claim 1. However, they do not specifically disclose that a machine learning technique is used to identify that the first ultrasound image depicts the target anatomical view. Rothberg et al. disclose using a machine learning technique (i.e. a neural network) to identify that the first ultrasound image depicts the target anatomical view (paragraphs [0312], [0149]-[0150], [0228], referring to inputting the ultrasound image to a neural network that is trained to identify an anatomical view contained in the ultrasound image), thereby providing an automated technique to determine whether an ultrasound image contains the target anatomical view (paragraph [0149]). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have the method of the above combined references comprise a machine learning technique used to identify that the first ultrasound image depicts the target anatomical view, as taught by Rothberg et al., in order to provide an automated technique to determine whether an ultrasound image contains the target anatomical view (paragraph [0149]).
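For illustration only, the following minimal sketch shows the kind of automated view-identification step described above, in which a trained convolutional neural network classifies the anatomical view in an ultrasound image. The architecture, label set, and function names are hypothetical assumptions for exposition, not Rothberg et al.'s implementation (the weights here are untrained, so the prediction is arbitrary):

```python
# Illustrative sketch only; architecture and labels are hypothetical, not
# drawn from Rothberg et al. A small multi-layer CNN maps an ultrasound
# frame to one of several anatomical-view classes.
import torch
import torch.nn as nn

class ViewClassifier(nn.Module):
    def __init__(self, num_views: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # global pooling to a 32-d feature
        )
        self.classifier = nn.Linear(32, num_views)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

VIEWS = ["PLAX", "A4C", "subcostal", "other"]  # hypothetical label set
model = ViewClassifier(num_views=len(VIEWS)).eval()

def depicts_target_view(image: torch.Tensor, target: str) -> bool:
    """Return True if the classifier assigns the frame to the target view."""
    with torch.no_grad():
        predicted = model(image.unsqueeze(0)).argmax(dim=1).item()
    return VIEWS[predicted] == target

frame = torch.rand(1, 128, 128)  # stand-in for a grayscale ultrasound frame
print(depicts_target_view(frame, "A4C"))
```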
With regards to claim 9, Rothberg et al. disclose that the machine learning technique includes using a multi-layer convolutional neural network to identify that the first ultrasound image depicts the target anatomical view (paragraph [0009], referring to the multi-layer neural network).
With regards to claim 10, as discussed above, the above combined references meet the limitations of claim 1. However, they do not specifically disclose that the host device is a mobile smartphone, a tablet, a laptop, a smart watch, a virtual reality headset, an augmented reality headset or a smart wearable device. Rothberg et al. disclose that guidance may be provided via a software application (i.e. APP) installed on a computing device, such as a mobile device, a smartphone, a smart device, a tablet, etc., of a subject at home (paragraphs [0144], [0218]). This allows a physician to remotely monitor a condition of the subject without requiring the subject to remain in inpatient care (paragraph [0218]). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have the host device be a mobile smartphone, a tablet, etc., as taught by Rothberg et al., in order to allow a physician to remotely monitor a condition of the subject without requiring the subject to remain in inpatient care (paragraph [0218]).
With regards to claim 11, Dunbar et al. disclose that the predetermined image or the predetermined video further depict text instructing the operator to pivot the ultrasound device along the predetermined path (paragraph [0027], referring to the video display further comprising text which explains in more detail how to handle the probe, which in the above combined references would include instructions to pivot the ultrasound device).
With regards to claim 12, as discussed above, the above combined references meet the limitations of claim 1. However, the above combined references do not specifically disclose that the ultrasound device generates orientation data using at least one of an accelerometer, a gyroscope, and a magnetometer, and the host device: determines a detected orientation of the ultrasound device based on the orientation data; and provides an instruction for pivoting the ultrasound device based on the detected orientation. Rothberg et al. disclose that the ultrasound device generates orientation data using at least one of an accelerometer, a gyroscope, and a magnetometer (paragraph [0236], referring to the ultrasound device comprising sensors, such as accelerometers, gyroscopes, etc., to detect movement information, wherein the movement information is used to determine the pose of the ultrasound device), and the host device: determines a detected orientation of the ultrasound device based on the orientation data (paragraph [0236], referring to using the movement information from the sensors to determine the pose (i.e. position and orientation) of the ultrasound device); and provides an instruction for pivoting the ultrasound device based on the detected orientation (paragraph [0164], referring to the pose of the ultrasound device being used to overlay an instruction onto at least part of the ultrasound device in the captured image; paragraphs [0153], [0161]). The pose may be employed to position elements in an augmented reality interface, thus allowing the operator to gain a better appreciation for the particular region of the subject that is being imaged given the current position of the ultrasound device on the subject (paragraphs [0164]-[0165], [0237]). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have the ultrasound device generate orientation data using at least one of an accelerometer, a gyroscope, and a magnetometer, and the host device determine a detected orientation of the ultrasound device based on the orientation data and provide an instruction for pivoting the ultrasound device based on the detected orientation, as taught by Rothberg et al., in order to position elements in an augmented reality interface, thus allowing the operator to gain a better appreciation for the particular region of the subject that is being imaged given the current position of the ultrasound device on the subject (paragraphs [0164]-[0165], [0237]).
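For illustration only, the orientation-data feedback loop described above might look like the following minimal sketch, which estimates the probe's tilt from accelerometer gravity components and compares it against the tilt prescribed by the predetermined path. The angle convention, tolerance, and instruction wording are hypothetical assumptions, not Rothberg et al.'s implementation:

```python
# Illustrative sketch only; angle convention, tolerance, and names are
# hypothetical, not drawn from Rothberg et al. Tilt is estimated from the
# gravity direction sensed by the probe's accelerometer, and a pivot
# instruction is chosen from the error against the prescribed tilt.
import math

def detected_tilt_deg(accel_xyz: tuple) -> float:
    """Estimate probe tilt (degrees from vertical) from accelerometer data."""
    ax, ay, az = accel_xyz
    return math.degrees(math.atan2(math.hypot(ax, ay), az))

def pivot_instruction(accel_xyz: tuple, target_tilt_deg: float,
                      tol_deg: float = 2.0) -> str:
    """Compare detected tilt with the path's prescribed tilt and pick a cue."""
    error = target_tilt_deg - detected_tilt_deg(accel_xyz)
    if abs(error) <= tol_deg:
        return "hold position: target tilt reached"
    direction = "toward the target tilt" if error > 0 else "back toward vertical"
    return f"pivot the probe {abs(error):.1f} degrees {direction}"

# Probe nearly upright (gravity mostly along z), prescribed tilt of 30 degrees:
print(pivot_instruction((0.5, 0.0, 9.8), target_tilt_deg=30.0))
```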
With regards to claim 13, Rothberg et al. disclose that the instruction for pivoting the ultrasound device based on the detected orientation includes, in the display showing the ultrasound device at the particular tilt along the predetermined path, a directional indicator (i.e. “directional arrow”) on an image of the ultrasound device (paragraphs [0237], [0244], referring to the computing device overlaying an instruction regarding how to move the ultrasound device (e.g., directional arrow) onto the image; paragraphs [0153], [0161], Figures 3B, 5B, 7E, 8D).
With regards to claim 14, Rothberg et al. disclose that the instruction for pivoting the ultrasound device based on the detected orientation includes a first instruction for the operator to initially position the ultrasound device at the particular tilt on the predetermined path (paragraphs [0153], [0161], [0185]-[0186], referring to instructions being provided from an initial position to a target position, wherein the instructions are provided serially; paragraph [0269], referring to the set of instructions including “(1) tilt the ultrasound device inferomedially”, which corresponds to a first instruction to position the ultrasound device at a particular tilt; paragraphs [0191]-[0195], referring to the coarse instruction to provide an instruction of where the operator should position the ultrasound device); and a second instruction for the operator to follow the predetermined path (paragraphs [0153], [0161], [0185]-[0186], referring to instructions being provided from an initial position to a target position, wherein the instructions are provided serially, and thus include a second instruction; paragraph [0269], referring to the set of instructions including “(1) tilt the ultrasound device inferomedially, (2) rotate the ultrasound device counterclockwise”, etc., and therefore there exist second instructions for the operator to follow the predetermined path; paragraphs [0191]-[0195], referring to the fine instructions for providing a direction to move the ultrasound device).
With regards to claim 15, Rothberg et al. disclose that the second instruction to follow the predetermined path is provided after the host device determines that the ultrasound device has reached the particular tilt based on the orientation data (paragraphs [0153], [0161], referring to the instructions being determined by comparing the current positioning of the ultrasound device relative to one or more prior positions of the ultrasound device which yielded the target ultrasound image; paragraphs [0191]-[0195], referring to the fine instruction being provided when the ultrasound device is positioned properly according to the coarse instructions; paragraph [0186], referring to the instructions being provided “serially”/one at a time).
Claims 16-28 are rejected under 35 U.S.C. 103 as being unpatentable over Dunbar et al. in view of Rothberg et al. and Wimer, alone, or alternatively, further in view of Lachaine et al.
With regards to claim 16, Dunbar et al. disclose a host device, comprising:
a memory (10); a display screen (8); and a processor (6, 9) (paragraphs [0045]-[0047]; Figure 1);
wherein the processor (6, 9; Figures 1,4) is configured to:
determine a predetermined path including a first tilt from which a target anatomical view (i.e. anatomical view corresponding to the target) is available and a second tilt from which the target anatomical view is not available (i.e. anatomical view corresponding to points other than the target, such as the landmark position) (paragraphs [0046]-[0047], referring to the processing unit (9) generating on a video display (8) primary and secondary demonstration video clips; paragraph [0050], referring to the primary demonstration video clip (27) showing the movement of the probe (3) from a landmark (i.e. associated with the second position/orientation/motion) to a target (i.e. associated with the target anatomical view); paragraphs [0016], [0020]-[0022], referring to the demonstration video clip allowing a user to see the position, orientation and motion of the probe to proceed from a landmark to a target feature, wherein such a position/orientation/motion at one instance would inherently correspond to a particular tilt of the probe with respect to at least one surface of the body at that instance; Figures 1 and 4; in particular, note in Figure 4 that the ultrasound probe in the demonstration clips (27, 28) is depicted with a tilt with respect to a surface of the body);
instruct an operator to move an ultrasound device along the predetermined path by displaying a display (8) on the display screen (paragraphs [0046]-[0047], referring to the processing unit (9) generating on a video display (8) primary and secondary demonstration video clips; paragraph [0050], referring to the primary demonstration video clip (27) showing the movement of the probe (3) from a landmark to a target; paragraphs [0016], [0020]-[0022], referring to the demonstration video clip allowing a user to see the motion of the probe to proceed from a landmark to a target feature, wherein the demonstration video clip is pre-recorded and thus provides an operator with instructions for moving an ultrasound device along a predetermined path relative to an anatomical area; Figures 1 and 4);
receive a first ultrasound image depicting the target anatomical view and a second ultrasound image not depicting the target anatomical view based on ultrasound data collected by the ultrasound device while moving along the predetermined path (paragraphs [0009], [0042], [0047], referring to obtaining a live ultrasound image (7) of an imaging slice defined by the position and orientation of the probe; paragraphs [0020]-[0021], referring to, based on the demonstration video clip, etc., the user seeing what motions help in a particular procedural step to proceed from a landmark to a target feature, thus facilitating proper manipulation of the probe, and therefore the live ultrasound images acquired during the motion/manipulation of the probe from a landmark to a target feature would provide the first ultrasound image (i.e. live image associated with the target feature) and the second ultrasound image (i.e. live image associated with the landmark); Figure 4); and
identify that the first ultrasound image depicts the target anatomical view (paragraphs [0017]-[0018], [0021], referring to the user seeing what motion helps in a particular procedural step to proceed from a landmark to a target feature based on the synchronized presentation of the demonstration video clip and the ultrasound image video clip and referring to presenting an ultrasound image video clip simultaneously with the primary demonstration video clip and the live ultrasound image which “allows the user to more naturally grasp the relationship between the manipulation of the scanning probe and the ultrasound image. Hence, this embodiment…can help the user to reproduce the procedural steps and interpret the live image…the user can on one hand compare the live ultrasound image with the ultrasound image video clip…”, wherein the comparison of the live ultrasound image (i.e. first ultrasound image) with the ultrasound image video clip to proceed from the landmark to a target feature would require identifying that the live image depicts the target anatomical view depicted in the ultrasound image video clip corresponding to the target feature; Figure 4), wherein
the predetermined path is determined prior to collection of the first ultrasound image and the second ultrasound image (paragraphs [0016]-[0017], referring to the position/motion (i.e. path) of the probe being provided in pre-recorded video form and thus determined prior to collecting the live image (i.e. first and second ultrasound images)),
the display includes a predetermined image or a predetermined video depicting (paragraphs [0016]-[0017], [0049], referring to the demonstration video clip (27, 28) retrieved from memory, and thus a “predetermined” video; Figure 4):
the anatomical area (paragraphs [0049]-[0050], referring to the anatomic model of a human being shown in the primary and secondary demonstration video clips (27, 28); Figure 4);
the predetermined path adjacent to the anatomical area depicted within the predetermined image or the predetermined video, wherein an image of the predetermined path is superimposed on the predetermined image or the predetermined video (paragraph [0050], referring to the primary and secondary demonstration video clips showing the movement (i.e. predetermined path) of the probe; for example, the primary demonstration video clip (27) shows the movement of the probe (3) from a landmark (i.e. associated with the second position/orientation/motion) to a target (i.e. associated with the target anatomical view), and thus an image of the predetermined path is superimposed on the predetermined video of the anatomical area via the depiction of the movement of the probe (3) from a landmark to the target, wherein such movement would define the “predetermined path”; paragraph [0011], referring to a demonstration video clip being an animated sequence of images demonstrating the performance of a step in the examination procedure, which instructs the user as to how he should perform a certain step of the procedure; Figure 4); and
the ultrasound device at a particular tilt along the predetermined path (see Figure 4; note that at one instance the ultrasound device is depicted at a particular tilt/orientation along the predetermined path (i.e. the series of movements of the probe shown in the video clips (27, 28) corresponding to movement from the landmark to the target feature)); and
text instructing the operator to move the ultrasound device along the predetermined path (paragraphs [0027], [0050], referring to written information complementing the graphical information provided to the user, wherein text is displayed on the video display in one or more text fields, wherein the text may explain in more detail how to handle the probe of the ultrasound scanner; Figure 4).
However, Dunbar et al. do not specifically disclose that the predetermined path has a pivot of fewer than 180 degrees about an anatomical area, that the instructed motion by the host device specifically provides a “pivot” instruction, or that the movement along the predetermined path is specifically a “pivoting” motion.
Rothberg et al. disclose using a guide path to generate a sequence of instructions to provide to the operator in order to guide the operator to move the ultrasound device from an initial position to a target position, thereby allowing operators with little or no experience operating ultrasound devices to capture medically relevant ultrasound images and/or interpret the contents of the obtained ultrasound images (Abstract; paragraphs [0189]-[0192]). The set of instructions provided to an operator to properly position the ultrasound device may include (1) tilt the ultrasound device inferomedially, (2) rotate the ultrasound device counterclockwise, (3) rotate the ultrasound device clockwise, (4) move the ultrasound device one intercostal space down, (5) move the ultrasound device one intercostal space up, and (6) slide the ultrasound device medially (paragraph [0269]; note that instructions such as “tilt the ultrasound device inferomedially” and to rotate/move/slide the ultrasound device would provide a path for the ultrasound device that has a pivot/tilt of fewer than 180 degrees about an anatomical area, wherein an instruction to “tilt” or “rotate” the ultrasound device would provide instructions for the operator to “pivot” an ultrasound device (i.e. note that tilting corresponds with pivoting around one axis and rotating corresponds to pivoting around another axis)).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to modify the predetermined path of Dunbar et al. to have a pivot of fewer than 180 degrees about an anatomical area, to have the instructed motion by the host device specifically provide a “pivot” instruction, and to have the movement along the predetermined path be specifically a “pivoting” motion, as taught by Rothberg et al., in order to allow operators with little or no experience operating ultrasound devices to capture medically relevant ultrasound images with fine movements/instructions and/or interpret the contents of the obtained ultrasound images (Abstract; paragraphs [0189]-[0192]).
With regards to the limitation that “an image of the predetermined path is superimposed on the predetermined image or the predetermined video”, if Dunbar is considered to not teach this limitation, alternatively, Lachaine et al. disclose a method of presenting a suggested path for an ultrasound probe along a patient’s surface, wherein a location-guidance image (125) is superimposed with a suggested path (100), thereby enabling the operator to include the appropriate anatomy in the scan with minimal searching and allowing the operator to reproduce the same scan from one session to the next (paragraphs [0012], [0015], [0021]; claim 7; Figures 2-3). The intended scanning path (100) can be shown as a line and can also denote a suggested direction of travel for the probe along the line (paragraph [0022]). In some embodiments, a movie loop may be compiled showing an actual representation of the suggested probe motion along the intended scan path, indicating a more detailed version of the motion (paragraph [0022]; Figures 2-3).
Therefore, alternatively, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have an image of the predetermined path be superimposed on the predetermined image or the predetermined video, as taught by Lachaine et al., in order to enable the operator to include the appropriate anatomy in the scan with minimal searching and allow the operator to reproduce the same scan from one session to the next (paragraph [0021]).
With regards to claim 17, Dunbar et al. disclose that identifying that the first ultrasound image depicts the target anatomical view occurs after the ultrasound device pivots along the predetermined path (paragraphs [0018], [0021], [0052], referring to the user, as they are manipulating the scanning probe, being able to compare the live ultrasound image and the ultrasound image video clip (i.e. which would include a pivot motion in the above combination of references) in order to reproduce the procedural steps for proceeding from a landmark to a target feature; therefore, it is inherent that the user would identify the first ultrasound image (i.e. live image associated with the target feature) as depicting the target anatomical view after the user manipulates the probe to follow the predetermined path (i.e. motion of the probe in the demonstration video clip, which includes a pivot instruction as set forth in the above combination of references), since the user would want to pause the live video frame after the probe has been manipulated to follow the predetermined path that leads to acquiring the target feature).
With regards to claim 18, Dunbar et al. disclose that identifying that the first ultrasound image depicts the target anatomical view occurs while the ultrasound device pivots along the predetermined path (paragraphs [0018], [0021], [0052], referring to the user, as they are manipulating the scanning probe, being able to compare the live ultrasound image and the ultrasound image video clip (i.e. which would include a pivot motion in the above combination of references) in order to reproduce the procedural steps for proceeding from a landmark to a target feature; therefore, it is inherent that the user would identify the first ultrasound image (i.e. live image associated with the target feature) as depicting the target anatomical view while the user manipulates/handles the probe to follow the predetermined path (i.e. motion of the probe in the demonstration video clip, which includes a pivot instruction as set forth in the above combination of references), since the ultrasound probe is acquiring the live images while it is being moved by the user).
With regards to claim 19, as discussed above, the above combined references meet the limitations of claim 16. However, they do not specifically disclose that their method further comprises saving the first ultrasound image to a memory, based on identifying that the first ultrasound image depicts the target anatomical view. Rothberg et al. disclose that, based on identifying that the first ultrasound image (i.e. captured ultrasound images) depicts the target anatomical view, the first ultrasound image is saved to memory (i.e. 1522) in order to add to the medical record of the subject (paragraph [0312]; Figure 15B). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have the method of the above combined references further comprise saving the first ultrasound image to a memory, based on identifying that the first ultrasound image depicts the target anatomical view, as taught by Rothberg et al., in order to add to the medical record of the subject (paragraph [0312]).
With regards to claim 20, as discussed above, the above combined references meet the limitations of claim 16. Further, Dunbar et al. disclose that determining the predetermined path includes accessing a database including a plurality of paths (i.e. probe motion paths as depicted in demonstration video clips associated with the different procedures) and retrieving the predetermined path based on a selection of a procedure (paragraphs [0047]-[0049], referring to the procedure selection screen (15), which includes a plurality of buttons (21) representing different procedures, which are stored in memory (10) and thus considered to be in a “database” in memory, wherein each selected procedure is associated with a demonstration video clip (27, 28); Figures 2-4). However, Dunbar et al. do not specifically disclose that the selection of a procedure corresponds to a selection of a target anatomical view. Wimer discloses that an operator selects from a menu a treatment target/anatomy of interest and the system then displays the guidance for providing correct transducer placement for achieving an image associated with the treatment target/anatomy of interest (paragraphs [0011], [0013], [0037]; Figure 4). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have the selection of the procedure of Dunbar et al. correspond to a selection of a target anatomical view, as taught by Wimer, in order to assist the operator with effecting an optimal treatment for the subject/patient for different treatment target anatomies (paragraphs [0011], [0013]).
With regards to claim 21, Dunbar et al. disclose that the database further includes the predetermined image or the predetermined video (paragraphs [0016], [0049], referring to the demonstration video clip being pre-recorded and stored in memory and retrieved from the memory (10)).
With regards to claim 22, Dunbar et al. disclose that instructing the operator includes instructions for the operator to maintain contact between at least a portion of a sensor of the ultrasound device and a pivot point on the predetermined path (paragraph [0016], referring to the demonstration video clips showing the position, orientation and motion of a probe of the ultrasound scanner relative to the patient’s anatomy, wherein, as shown in Figure 4, the demonstration videos (27, 28) depict that the probe is in contact with a point on the predetermined path, which would correspond to a pivot point in the above combined references). Alternatively, Rothberg et al. disclose that instructions are provided for the operator to maintain contact between at least a portion of a sensor of the ultrasound device and a pivot point on the predetermined path (paragraph [0269]; also see Figure 8D, wherein the displayed instructions to rotate would also provide instructions that contact should be maintained at the pivot/rotation point). Therefore, if Dunbar et al. are viewed as not teaching instructions for the operator to maintain contact between at least a portion of a sensor of the ultrasound device and a pivot point on the predetermined path, alternatively, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to further modify the method of the above combined references to provide instructions for the operator to maintain contact between at least a portion of a sensor of the ultrasound device and a pivot point on the predetermined path, as taught by Rothberg et al., in order to successfully capture an ultrasound image of the subject that contains a particular anatomical view (Abstract; paragraph [0269]).
With regards to claim 23, as discussed above, the above combined references meet the limitations of claim 16. However, they do not specifically disclose that a machine learning technique is used to identify that the first ultrasound image depicts the target anatomical view. Rothberg et al. disclose using a machine learning technique (i.e. a neural network) to identify that the first ultrasound image depicts the target anatomical view (paragraphs [0312], [0149]-[0150], [0228], referring to inputting the ultrasound image to a neural network that is trained to identify an anatomical view contained in the ultrasound image), thereby providing an automated technique to determine whether an ultrasound image contains the target anatomical view (paragraph [0149]). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have the method of the above combined references comprise a machine learning technique used to identify that the first ultrasound image depicts the target anatomical view, as taught by Rothberg et al., in order to provide an automated technique to determine whether an ultrasound image contains the target anatomical view (paragraph [0149]).
With regards to claim 24, Rothberg et al. disclose that the machine learning technique includes using a multi-layer convolutional neural network to identify that the first ultrasound image depicts the target anatomical view (paragraph [0009], referring to the multi-layer neural network).
With regards to claim 25, as discussed above, the above combined references meet the limitations of claim 16. However, they do not specifically disclose that the host device is a mobile smartphone, a tablet, a laptop, a smart watch, a virtual reality headset, an augmented reality headset or a smart wearable device. Rothberg et al. disclose that guidance may be provided via a software application (i.e. APP) installed on a computing device, such as a mobile device, a smartphone, a smart device, a tablet, etc., of a subject at home (paragraphs [0144], [0218]). This allows a physician to remotely monitor a condition of the subject without requiring the subject to remain in inpatient care (paragraph [0218]). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have the host device be a mobile smartphone, a tablet, etc., as taught by Rothberg et al., in order to allow a physician to remotely monitor a condition of the subject without requiring the subject to remain in inpatient care (paragraph [0218]).
With regards to claim 26, Dunbar et al. disclose that the predetermined image or the predetermined video further depict text instructing the operator to pivot the ultrasound device along the predetermined path (paragraph [0027], referring to the video display further comprising text which explains in more detail how to handle the probe, which in the above combined references would include instructions to pivot the ultrasound device).
With regards to claim 27, as discussed above, the above combined references meet the limitations of claim 16. However, the above combined references do not specifically disclose that the ultrasound device generates orientation data using at least one of an accelerometer, a gyroscope, and a magnetometer, and the host device: determines a detected orientation of the ultrasound device based on the orientation data; and provides an instruction for pivoting the ultrasound device based on the detected orientation. Rothberg et al. disclose that the ultrasound device generates orientation data using at least one of an accelerometer, a gyroscope, and a magnetometer (paragraph [0236], referring to the ultrasound device comprising sensors, such as accelerometers, gyroscopes, etc., to detect movement information, wherein the movement information is used to determine the pose of the ultrasound device), and the host device: determines a detected orientation of the ultrasound device based on the orientation data (paragraph [0236], referring to using the movement information from the sensors to determine the pose (i.e. position and orientation) of the ultrasound device); and provides an instruction for pivoting the ultrasound device based on the detected orientation (paragraph [0164], referring to the pose of the ultrasound device being used to overlay an instruction onto at least part of the ultrasound device in the captured image; paragraphs [0153], [0161]). The pose may be employed to position elements in an augmented reality interface, thus allowing the operator to gain a better appreciation for the particular region of the subject that is being imaged given the current position of the ultrasound device on the subject (paragraphs [0164]-[0165], [0237]). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have the ultrasound device generate orientation data using at least one of an accelerometer, a gyroscope, and a magnetometer, and the host device determine a detected orientation of the ultrasound device based on the orientation data and provide an instruction for pivoting the ultrasound device based on the detected orientation, as taught by Rothberg et al., in order to position elements in an augmented reality interface, thus allowing the operator to gain a better appreciation for the particular region of the subject that is being imaged given the current position of the ultrasound device on the subject (paragraphs [0164]-[0165], [0237]).
With regards to claim 28, Rothberg et al. disclose that the instruction for pivoting the ultrasound device based on the detected orientation includes, in the display showing the ultrasound device at the particular tilt along the predetermined path, a directional indicator (i.e. “directional arrow”) on an image of the ultrasound device (paragraphs [0237], [0244], referring to the computing device overlaying an instruction regarding how to move the ultrasound device (e.g., directional arrow) onto the image; paragraphs [0153], [0161], Figures 3B, 5B, 7E, 8D).
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 1-28 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-23 of U.S. Patent No. 12,109,066 in view of Lachaine et al.
With regards to instant claims 1 and 16, claims 1 and 14 of the Patent disclose most of the limitations (i.e. host device, determining a predetermined path, instructing an operator to pivot, receiving a first ultrasound image and a second ultrasound image, identifying that the first ultrasound image depicts the target anatomical view, the display, etc.) of instant claims 1 and 16. However, the Patent does not specifically disclose that an image of the predetermined path is superimposed on the predetermined image or the predetermined video. Lachaine et al. disclose a method of presenting a suggested path for an ultrasound probe along a patient’s surface, wherein a location-guidance image (125) is superimposed with a suggested path (100), thereby enabling the operator to include the appropriate anatomy in the scan with minimal searching and allowing the operator to reproduce the same scan from one session to the next (paragraphs [0012], [0015], [0021]; claim 7; Figures 2-3). The intended scanning path (100) can be shown as a line and can also denote a suggested direction of travel for the probe along the line (paragraph [0022]). In some embodiments, a movie loop may be compiled showing an actual representation of the suggested probe motion along the intended scan path, indicating a more detailed version of the motion (paragraph [0022]; Figures 2-3). Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have an image of the predetermined path of the Patent be superimposed on the predetermined image or the predetermined video, as taught by Lachaine et al., in order to enable the operator to include the appropriate anatomy in the scan with minimal searching and allow the operator to reproduce the same scan from one session to the next (paragraph [0021]).
With regards to instant claim 2, claim 1 of the Patent sets forth the same limitations.
With regards to instant claim 3, claim 1 of the Patent sets forth the same limitations.
With regards to instant claim 4, claim 2 of the Patent sets forth the same limitations.
With regards to instant claim 5, claim 3 of the Patent sets forth the same limitations.
With regards to instant claim 6, claim 4 of the Patent sets forth the same limitations.
With regards to instant claim 7, claim 5 of the Patent sets forth the same limitations.
With regards to instant claim 8, claim 6 of the Patent sets forth the same limitations.
With regards to instant claim 9, claim 7 of the Patent sets forth the same limitations.
With regards to instant claim 10, claim 8 of the Patent sets forth the same limitations.
With regards to instant claim 11, claim 9 of the Patent sets forth the same limitations.
With regards to instant claim 12, claim 10 of the Patent sets forth the same limitations.
With regards to instant claim 13, claim 11 of the Patent sets forth the same limitations.
With regards to instant claim 14, claim 12 of the Patent sets forth the same limitations.
With regards to instant claim 15, claim 13 of the Patent sets forth the same limitations.
With regards to instant claim 17, claim 14 of the Patent sets forth the same limitations.
With regards to instant claim 18, claim 14 of the Patent sets forth the same limitations.
With regards to instant claim 19, claim 15 of the Patent sets forth the same limitations.
With regards to instant claim 20, claim 16 of the Patent sets forth the same limitations.
With regards to instant claim 21, claim 17 of the Patent sets forth the same limitations.
With regards to instant claim 22, claim 18 of the Patent sets forth the same limitations.
With regards to instant claim 23, claim 19 of the Patent sets forth the same limitations.
With regards to instant claim 24, claim 20 of the Patent sets forth the same limitations.
With regards to instant claim 25, claim 21 of the Patent sets forth the same limitations.
With regards to instant claim 26, claim 14 of the Patent sets forth the same limitations.
With regards to instant claim 27, claim 22 of the Patent sets forth the same limitations.
With regards to instant claim 28, claim 23 of the Patent sets forth the same limitations.
Response to Arguments
Applicant’s arguments with respect to claim(s) 1-28 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Applicant argues that there is no teaching or suggestion in any of Dunbar, Rothberg and Wimer of displaying a “predetermined path” superimposed on the predetermined image or predetermined video.
Examiner respectfully disagrees and points to paragraphs [0011] and [0050] of Dunbar, which refer to the primary and secondary demonstration video clips showing the movement (i.e. the predetermined path) of the probe. The primary demonstration video clip (27) is disclosed as showing the movement of the probe (3) from a landmark (i.e. associated with the second position/orientation/motion) to a target (i.e. associated with the target anatomical view). By depicting the movement of the probe (3) from the landmark to the target in the animated sequence of images of the predetermined video, an image of the predetermined path is thus superimposed on the predetermined video (i.e. the “animated sequence of images”) of the anatomical area, wherein such movement defines the “predetermined path”.
In the event this is not persuasive, an alternative ground of rejection has been provided wherein Lachaine has been introduced to teach this limitation.
The claims therefore remain rejected.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to KATHERINE L FERNANDEZ whose telephone number is (571)272-1957. The examiner can normally be reached Monday-Friday 9:00 AM - 5:30 PM (ET).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Pascal Bui-Pho can be reached at (571) 272-2714. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/KATHERINE L FERNANDEZ/Primary Examiner, Art Unit 3798