Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
Response to Arguments
Regarding objections.
Applicant argues:
The applicant has amended the title of the application to recite "IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD, AND PROGRAM FOR DERIVING SIDE VIEWPOINT IMAGE ACCORDING TO CUT SECTION" (underlined marks added for indicating the amendments).
It is respectfully submitted that the currently presented title is descriptive and is clearly indicative of the invention to which the claims are directed.
Examiner replies that:
The objection to the title is withdrawn in view of the amendment.
Regarding 35 USC § 102/103.
Applicant argues:
Fig. 8 of Coustaud illustrates grasping and removing of a 2D image slice 810 from a 3D image volume 820 using the hand avatar 220 in the virtual environment 200. The position where the user pulls the image slice 810 can be interpreted as the claimed cut section. Additionally, the angle of viewing can be adjusted, and the 3D image volume can be rotated. Accordingly, after the 3D image volume (the claimed "first display image", i.e., the claimed "side viewpoint image") is rotated, the user can pull another 2D image slice, rather than the 2D image slice 810, corresponding to the other angle (i.e., the claimed side viewpoint).
However, the claimed "side viewpoint image" differs from the 3D image volume of Coustaud. As clarified in the claim, the claimed "side viewpoint image" shows a part of the organ and at least one target tissue internal the organ. In contrast, referring to Coustaud's col. 9, lines 10-30, and Fig. 8, Coustaud merely displays the 3D image volume showing the entire organ and pulls a 2D image slice from the 3D image volume. Coustaud does not teach or suggest outputting the 3D image volume showing both a part of the organ and the target tissue inside the organ as claimed.
Thus, Coustaud does not disclose the features "output, as a first display image that is generated by rendering based on the three-dimensional image and shows the organ, a side viewpoint image having a side viewpoint at which the set cut section is viewed from a direction intersecting a normal line of the cut section, set as a viewpoint of the rendering, wherein the side viewpoint image shows a part of the organ and at least one target tissue internal the organ" in claim 1.
Examiner replies that:
Applicant's arguments are not found persuasive. Applicant argues that the side viewpoint does not show a part of the organ and internal target tissue. Coustaud provides for a number of views, including multiple slices. A side view of the volume (Coustaud C4 L13-20) or a side view of a slice (Coustaud C8 L10-25) can be viewed by the user, and the user can also adjust masks/opacity to view internal target tissue (Coustaud C6 L1-5)(Coustaud C8 L40-50). See the rejection below for a more detailed mapping.
While Coustaud does allow for creating multiple slices and looking at different views, the examiner agrees there are differences between Coustaud and Applicant's drawings (Figs. 5-7), which show the claimed side viewpoint with multiple 2D viewports being automatically generated based on the slice position. However, further clarification of the claims would be necessary to overcome Coustaud.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-11 and 14-16 are rejected under 35 U.S.C. 103 as being unpatentable over Coustaud (U.S. Patent No. 11,062,527).
Regarding claim 1 (independent):
An image processing device comprising: a processor, (Coustaud An example medical image visualization and interaction apparatus includes at least one processor and at least one memory including instructions. The instructions, when executed, cause the at least one processor to at least: generate a virtual environment for display of image content via a virtual reality display device)
wherein the processor is configured to receive a setting of a cut section with respect to an organ shown by a three-dimensional image, (Coustaud C9 L10-30 FIGS. 8-10 illustrate various manipulations of image slices within and outside of a 3D image volume shown in the virtual environment 200. For example, FIG. 8 illustrates grasping and removing of a 2D image slice 810 from a 3D image volume 820 using the hand avatar 220 in the virtual environment 200.)
and output, as a first display image that is generated by rendering based on the three-dimensional image and shows the organ, a side viewpoint image having a side viewpoint at which the set cut section is viewed from a direction intersecting a normal line of the cut section, set as a viewpoint of the rendering (Coustaud C9 L10-30 As shown in the example of FIG. 8, the 2D image slice 810 can be viewed with respect to the 3D volume 820 as well as separate from the 3D volume 820. FIG. 9 shows the virtual environment 200 rotated as the user views the 3D volume 820 and 2D slice 810 from a different angle. The angle of viewing can be adjusted by manipulating the avatar 220, selecting a menu option 230 (not shown), moving a scroll wheel and/or touch pad, clicking a button, etc. FIG. 10 shows a horizontal 2D image slice 1010, rather than a vertical image slice 810, being selected from the 3D volume 1020 and being manipulated by the avatar hands 220 to separate the 2D image 1010 from the 3D image volume 1020, for example. The user can pull the slice 1010 towards him/her, push the image slice 1010 away, etc., using the hands 220, for example.).
wherein the side viewpoint image shows a part of the organ (Coustaud C4 L13-20 The immersive VR environment provides the user with an increased number of viewing angles and vantage points into a patient. Additionally, an ability to overlay different scans and virtual representations gives the user a unique visual context when viewing the images. The VR goggles can display internal views of organs and blood vessels in addition to the surface views.) where the user can view the volume from the side. Alternatively, there is a side viewpoint of slices when they have a thickness (Coustaud C8 L10-25 In certain examples, tools 230 such as a slice thickness adjustment tool can be provided to a user in the virtual environment 200. In certain examples, a slice thickness indicator can be displayed and manipulated by the user in the virtual environment 200 to adjust the displayed slice thickness. A slice selector can also be displayed as part of or separate from the menu of tools 230. For example, a user can select a slice from within a displayed 3D volume to focus operations, 2D slice views, combination between the 2D slice and the 3D volume, etc. In certain examples, slice selection and/or slice thickness adjustment can be facilitated via one or more tools 230 in the virtual environment 200 by sliding a bar, indicator, other slider, etc., to select a slice, expand or contract selected slice thickness, etc.)
and at least one target tissue internal the organ (Coustaud C6 L1-5 In certain examples, the virtual environment displays external view(s) of an anatomy of interest, internal view(s) of an anatomy of interest (e.g., inside organs, vessels, along endoscopic viewing angles, etc.), combinations of internal and external views, etc.)(Coustaud C16 L10-20 In certain examples, masks can be applied to display segmented anatomy(-ies). A MIP selection, orientation, window width, window level, etc., can be adjusted for one or more of the representations. Volume rendered opacity can be adjusted, for example. Masks can be applied to volume-rendered image(s) to display segmented anatomy(-ies), for example.)(Coustaud C8 L40-50 Different scans, different planes, etc., can be adjusted in the virtual environment 200. The user can interact with each individual representation (e.g., anatomic models, MIP views, 2D slice, MR image, CT image, etc.) to adjust thickness, opacity, position, overlay, measurement, annotation, etc., in the virtual environment 200.) since inside organs can be viewed, particularly with opacity adjustments.
Coustaud discloses the above elements in several embodiments. With the embodiments being disclosed in a single reference, one of ordinary skill in the art before the effective filing date of the claimed invention who was aware of one embodiment would also have been aware of the others, and it would have been obvious to combine these elements from two or more embodiments into a single arrangement in order to enjoy the advantages of all of the disclosed embodiments together.
Regarding claim 2:
The image processing device according to claim 1, has all of its limitations taught by Coustaud. Coustaud further teaches wherein the processor is configured to output, as a second display image, a cross section image that shows a cross section of a human body including the organ and on which a position of the viewpoint of the first display image is displayed, and perform display control for displaying the first display image and the second display image in parallel on a display screen (Coustaud C8 L50-65 FIGS. 3-10 illustrate example implementations, modes, or snapshots of the example virtual environment 200 provided for user viewing and interaction. FIG. 3 illustrates an example of the virtual environment 200 in which a 2D image 310 is displayed with respect to a 3D volume 320. A dashed plane 330 illustrates a cut of the image 310 with respect to the 3D volume 320. The dashed plane 330 is movable in the virtual environment 200 (e.g., via the hand avatar 220, etc.) to show a particular 2D image cut 310 in the virtual environment 200.)(Coustaud Fig. 8).
Regarding claim 3:
The image processing device according to claim 2, has all of its limitations taught by Coustaud. Coustaud further teaches wherein the cross section image includes a plurality of cross section images that show respective cross sections of an axial cross section, a coronal cross section, and a sagittal cross section (Coustaud C9 L10-30 As shown in the example of FIG. 8, the 2D image slice 810 can be viewed with respect to the 3D volume 820 as well as separate from the 3D volume 820. FIG. 9 shows the virtual environment 200 rotated as the user views the 3D volume 820 and 2D slice 810 from a different angle. The angle of viewing can be adjusted by manipulating the avatar 220, selecting a menu option 230 (not shown), moving a scroll wheel and/or touch pad, clicking a button, etc. FIG. 10 shows a horizontal 2D image slice 1010, rather than a vertical image slice 810, being selected from the 3D volume 1020 and being manipulated by the avatar hands 220 to separate the 2D image 1010 from the 3D image volume 1020, for example.) since the slice can be moved to show axial, coronal or sagittal cross sections over time. The claim does not require all to be displayed at the same time.
Regarding claim 4:
The image processing device according to claim 1, has all of its limitations taught by Coustaud. Coustaud further teaches wherein the processor is configured to output a plurality of the side viewpoint images having different viewing directions in surroundings of the cut section, and switch and display the plurality of side viewpoint images as the first display image (Coustaud C9 L10-30 As shown in the example of FIG. 8, the 2D image slice 810 can be viewed with respect to the 3D volume 820 as well as separate from the 3D volume 820. FIG. 9 shows the virtual environment 200 rotated as the user views the 3D volume 820 and 2D slice 810 from a different angle. The angle of viewing can be adjusted by manipulating the avatar 220, selecting a menu option 230 (not shown), moving a scroll wheel and/or touch pad, clicking a button, etc. FIG. 10 shows a horizontal 2D image slice 1010, rather than a vertical image slice 810, being selected from the 3D volume 1020 and being manipulated by the avatar hands 220 to separate the 2D image 1010 from the 3D image volume 1020, for example. The user can pull the slice 1010 towards him/her, push the image slice 1010 away, etc., using the hands 220, for example.)(Coustaud Fig. 8-10).
Regarding claim 5:
The image processing device according to claim 4, has all of its limitations taught by Coustaud. Coustaud further teaches wherein, in a case where a head side in a body axis direction is an upside and an opposite side is a downside, the plurality of side viewpoint images include at least a first side viewpoint image obtained by viewing the cut section from the upside and a second side viewpoint image obtained by viewing the cut section from the downside (Coustaud C17 L25-40 In the 3D view in the virtual environment, however, the doctor can select/place plane(s)) (Coustaud C9 L10-30 FIG. 9 shows the virtual environment 200 rotated as the user views the 3D volume 820 and 2D slice 810 from a different angle. The angle of viewing can be adjusted by manipulating the avatar 220, selecting a menu option 230 (not shown), moving a scroll wheel and/or touch pad, clicking a button, etc. FIG. 10 shows a horizontal 2D image slice 1010, rather than a vertical image slice 810, being selected from the 3D volume 1020 and being manipulated by the avatar hands 220 to separate the 2D image 1010 from the 3D image volume 1020, for example. The user can pull the slice 1010 towards him/her, push the image slice 1010 away, etc., using the hands 220, for example.)(Coustaud C8 L10-20 In certain examples, tools 230 such as a slice thickness adjustment tool can be provided to a user in the virtual environment 200. In certain examples, a slice thickness indicator can be displayed and manipulated by the user in the virtual environment 200 to adjust the displayed slice thickness.) since there is a slice, and the user can look at either side of the slice which will show the upside and downside when horizontal.
Regarding claim 6:
The image processing device according to claim 5, has all of its limitations taught by Coustaud. Coustaud further teaches wherein a first side viewpoint of the first side viewpoint image and a second side viewpoint of the second side viewpoint image are set on a reference line passing through a reference point set in advance within the cut section (Coustaud C17 L25-40 In the 3D view in the virtual environment, however, the doctor can select/place plane(s)) (Coustaud C9 L10-30 FIG. 9 shows the virtual environment 200 rotated as the user views the 3D volume 820 and 2D slice 810 from a different angle. The angle of viewing can be adjusted by manipulating the avatar 220, selecting a menu option 230 (not shown), moving a scroll wheel and/or touch pad, clicking a button, etc. FIG. 10 shows a horizontal 2D image slice 1010, rather than a vertical image slice 810, being selected from the 3D volume 1020 and being manipulated by the avatar hands 220 to separate the 2D image 1010 from the 3D image volume 1020, for example. The user can pull the slice 1010 towards him/her, push the image slice 1010 away, etc., using the hands 220, for example.)(Coustaud C8 L10-20 In certain examples, tools 230 such as a slice thickness adjustment tool can be provided to a user in the virtual environment 200. In certain examples, a slice thickness indicator can be displayed and manipulated by the user in the virtual environment 200 to adjust the displayed slice thickness.) since the slice is the reference line.
Regarding claim 7:
The image processing device according to claim 6, has all of its limitations taught by Coustaud. Coustaud further teaches wherein one of the first side viewpoint and the second side viewpoint is settable as an initial position (Coustaud C7 L15-30 FIG. 2 depicts an example virtual environment 200 including a 2D image slice 205 overlaid on a 3D image volume 210. Within the virtual environment 200, the user can interact with the composite of 2D 205 and 3D 210 image information via an avatar and/or other virtual representation of themselves, such as a stylized hand 220, etc. The virtual hand 220 can mimic the user's hand movement and/or otherwise be controlled (e.g., by gesture, by touchpad, by mouse, by joystick, by sensor glove, etc.) to move to a desired location in the virtual environment 200. In certain examples, a menu 230 can be activated in the virtual environment 200 to facilitate manipulation of the image(s) 205, 210, annotation, processing, overlay of another image, etc. In certain examples, the user can move the image(s) 205, 210 backward, forward, sideways, rotate, peel apart, combine together, zoom in/out, etc., by manipulating the image(s) 205, 210 via the hand/cursor/other virtual tool 220.) since the user places the slice, they determine the initial position and can place the slice such that the first or second side is visible initially.
Regarding claim 8:
The image processing device according to claim 7, has all of its limitations taught by Coustaud. Coustaud further teaches wherein the second side viewpoint is set as the initial position (Coustaud C7 L15-30 FIG. 2 depicts an example virtual environment 200 including a 2D image slice 205 overlaid on a 3D image volume 210. Within the virtual environment 200, the user can interact with the composite of 2D 205 and 3D 210 image information via an avatar and/or other virtual representation of themselves, such as a stylized hand 220, etc. The virtual hand 220 can mimic the user's hand movement and/or otherwise be controlled (e.g., by gesture, by touchpad, by mouse, by joystick, by sensor glove, etc.) to move to a desired location in the virtual environment 200. In certain examples, a menu 230 can be activated in the virtual environment 200 to facilitate manipulation of the image(s) 205, 210, annotation, processing, overlay of another image, etc. In certain examples, the user can move the image(s) 205, 210 backward, forward, sideways, rotate, peel apart, combine together, zoom in/out, etc., by manipulating the image(s) 205, 210 via the hand/cursor/other virtual tool 220.) since the user places the slice, they determine the initial position and can place the slice such that the first or second side is visible initially.
Regarding claim 9:
The image processing device according to claim 2, has all of its limitations taught by Coustaud. Coustaud further teaches wherein an intersection position where an extension line in a visual line direction of the set side viewpoint intersects a body surface is displayable (Coustaud Fig. 8)(Coustaud C8 L50-60 FIG. 3 illustrates an example of the virtual environment 200 in which a 2D image 310 is displayed with respect to a 3D volume 320. A dashed plane 330 illustrates a cut of the image 310 with respect to the 3D volume 320.).
Regarding claim 10:
The image processing device according to claim 1, has all of its limitations taught by Coustaud. Coustaud further teaches wherein the side viewpoint is set according to at least one of information based on an input of a user, information regarding the organ, or information regarding an operative method for cutting the organ (Coustaud C9 L10-30 FIG. 8 illustrates grasping and removing of a 2D image slice 810 from a 3D image volume 820 using the hand avatar 220 in the virtual environment 200. […] The angle of viewing can be adjusted by manipulating the avatar 220, selecting a menu option 230 (not shown), moving a scroll wheel and/or touch pad, clicking a button, etc. FIG. 10 shows a horizontal 2D image slice 1010, rather than a vertical image slice 810, being selected from the 3D volume 1020 and being manipulated by the avatar hands 220 to separate the 2D image 1010 from the 3D image volume 1020, for example. The user can pull the slice 1010 towards him/her, push the image slice 1010 away, etc., using the hands 220, for example.).
Regarding claim 11:
The image processing device according to claim 2, has all of its limitations taught by Coustaud. Coustaud further teaches wherein the processor is configured to change another viewpoint in conjunction in a case where the viewpoint is changed in one of the first display image and the second display image on the display screen (Coustaud Fig. 8)(Coustaud C9 L10-30 As shown in the example of FIG. 8, the 2D image slice 810 can be viewed with respect to the 3D volume 820 as well as separate from the 3D volume 820. FIG. 9 shows the virtual environment 200 rotated as the user views the 3D volume 820 and 2D slice 810 from a different angle. The angle of viewing can be adjusted by manipulating the avatar 220, selecting a menu option 230 (not shown), moving a scroll wheel and/or touch pad, clicking a button, etc. FIG. 10 shows a horizontal 2D image slice 1010, rather than a vertical image slice 810, being selected from the 3D volume 1020 and being manipulated by the avatar hands 220 to separate the 2D image 1010 from the 3D image volume 1020, for example. The user can pull the slice 1010 towards him/her, push the image slice 1010 away, etc., using the hands 220, for example.) where the environment (including the 3D volume) can be rotated and the slice can be rotated, which will update the secondary slice view.
Regarding claim 14:
The image processing device according to claim 1, has all of its limitations taught by Coustaud. Coustaud further teaches wherein the first display image is switchable to a viewpoint image obtained by viewing the organ from a viewpoint different from the side viewpoint (Coustaud C9 L10-30 As shown in the example of FIG. 8, the 2D image slice 810 can be viewed with respect to the 3D volume 820 as well as separate from the 3D volume 820. FIG. 9 shows the virtual environment 200 rotated as the user views the 3D volume 820 and 2D slice 810 from a different angle. The angle of viewing can be adjusted by manipulating the avatar 220, selecting a menu option 230 (not shown), moving a scroll wheel and/or touch pad, clicking a button, etc. FIG. 10 shows a horizontal 2D image slice 1010, rather than a vertical image slice 810, being selected from the 3D volume 1020 and being manipulated by the avatar hands 220 to separate the 2D image 1010 from the 3D image volume 1020, for example.) since manipulating the slice changes the viewpoint.
Regarding claim 15 (independent):
The claim is a parallel version of claim 1. As such, it is rejected under the same teachings.
Regarding claim 16 (independent):
The claim is a parallel version of claim 1. As such, it is rejected under the same teachings.
Claims 12 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Coustaud (U.S. Patent No. 11,062,527) in view of Rioux (U.S. Patent No. 6,268,863).
Regarding claim 12:
The image processing device according to claim 1, has all of its limitations taught by Coustaud. Coustaud does not teach image filtering, although it teaches saving the slice data (Coustaud C14 L50-60 At block 1310, an output is generated from the virtual environment 200. For example, a composite image, image slice, image volume, annotated tumor measurement, etc., can be generated and output to a secondary display 150, an external system 160, data storage 115, other image data processing and/or clinical decision support application, etc. At block 1312, the output is saved. For example, the image content and/or related annotation, measurement, reporting, etc., information can be saved in the memory 115, routed to storage and/or application at the external system 160, provided to an image archive, radiology desktop workstation, etc.). In a related field of endeavor, Rioux teaches:
wherein the processor is configured to acquire optical characteristic information representing an optical characteristic of a camera, and execute characteristic reflection processing of reflecting the optical characteristic in the first display image based on the optical characteristic information (Rioux C6 L60-C7 L5 In order to render the virtual object, a camera is simulated. The camera has parameter values similar to real photographic cameras. A perspective is providing to the rendering system. A lens focal length or other lens identifier is selected from predetermined lenses. The imaging plane of the virtual camera, where film is located in its real counterpart, is oriented relative to the virtual lens. Film speed, f-stop, focus, and shutter speed are selected from available values.)(Rioux C7 L20-30 Referring to FIG. 6, another user interface is shown for entering the parameter values. Here, an image of a camera is provided having input/output analogous to that of a photographic camera and representing values of parameters. An f-stop lever 100 is slidable across the lens 300.).
Therefore, it would have been obvious before the effective filing date of the claimed invention to apply a filter as taught by Rioux. The rationale for doing so would have been that the combination merely combines prior art elements according to known methods to yield predictable results: Coustaud has a 3D scene that is rendered to a 2D slice and saved, and Rioux renders 3D scenes using changeable camera parameters, where the end result is merely an aesthetic difference in the output of the Coustaud image with no change to the functionality of the medical imaging. Further, the motivation to combine would have been to provide the user of Coustaud with more aesthetic customization when images are shared (Coustaud C11 L5-15). Therefore, it would have been obvious to combine Rioux with Coustaud to obtain the claimed invention.
Regarding claim 13:
The image processing device according to claim 12, has all of its limitations taught by Coustaud in view of Rioux. Rioux further teaches wherein at least one of a distortion characteristic (Rioux C6 L60-C7 L5 In order to render the virtual object, a camera is simulated. The camera has parameter values similar to real photographic cameras. A perspective is providing to the rendering system. A lens focal length or other lens identifier is selected from predetermined lenses. The imaging plane of the virtual camera, where film is located in its real counterpart, is oriented relative to the virtual lens. Film speed, f-stop, focus, and shutter speed are selected from available values.)(Rioux C7 L20-30 Referring to FIG. 6, another user interface is shown for entering the parameter values. Here, an image of a camera is provided having input/output analogous to that of a photographic camera and representing values of parameters. An f-stop lever 100 is slidable across the lens 300.).
Therefore, it would have been obvious before the effective filing date of the claimed invention to apply a filter as taught by Rioux. The rationale for doing so would have been that the combination merely combines prior art elements according to known methods to yield predictable results: Coustaud has a 3D scene that is rendered to a 2D slice and saved, and Rioux renders 3D scenes using changeable camera parameters, where the end result is merely an aesthetic difference in the output of the Coustaud image with no change to the functionality of the medical imaging. Further, the motivation to combine would have been to provide the user of Coustaud with more aesthetic customization when images are shared (Coustaud C11 L5-15). Therefore, it would have been obvious to combine Rioux with Coustaud to obtain the claimed invention.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JASON PRINGLE-PARKER, whose telephone number is (571) 272-5690 and e-mail is jason.pringle-parker@uspto.gov. The examiner can normally be reached 8:30am-5:00pm EST, Monday-Friday. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, King Poon, can be reached at (571) 270-0728. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JASON A PRINGLE-PARKER/
Primary Examiner, Art Unit 2617