DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-14 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Coustaud et al. (U.S. Pub. No. 2021/0327159).
Regarding claim 1, Coustaud discloses an image processing apparatus comprising: a processor (paragraph 2, line(s) 1-2 "medical image processing"; also, FIG. 1; also, paragraph 4, line(s) 2 "interaction apparatus"), wherein the processor is configured to: display, on a screen (paragraph 13, line(s) 4-6 "processor to at least: generate a virtual environment for display of image content"), a plurality of two-dimensional images that are used to generate a three-dimensional image showing a target object in a real space and that are associated with a plurality of portions of the three-dimensional image (paragraph 5, line(s) 1-3 "two-dimensional image slice overlaid on a three-dimensional image"; also, FIG. 2; also, paragraph 19, line(s) 18-21 "display a virtual representation of the organ and the tumor so that the user can better visualize the exact location of the tumor in a 3D environment"; also, paragraph 19, line(s) 22-25 "Certain examples also provide a method and mechanism to scan and/or to take a previously scanned 3D object, such as a tool, implant, etc., and display it in the VR environment"; also, paragraph 38, line(s) 6-9 "user can access to the full 3D/4D content to identify a region of interest and/or an object in the image(s) 205, 210, segment the image(s) 205, 210, manipulate the image(s) 205, 210, etc."), and the three-dimensional image in a state in which the plurality of two-dimensional images and the three-dimensional image are comparable with each other (FIG. 3, 4; also, paragraph 45, line(s) 15-17 "FIGS. 3-4, the 2D image slice 310 and the 3D image volume 320 can be of the same"); select a two-dimensional image of interest from among the plurality of two-dimensional images in response to a given selection instruction (paragraph 42, line(s) 7-10 "user can select a slice from within a displayed 3D volume to focus operations, 2D slice views, combination between the 2D slice and the 3D volume, etc."; also, paragraph 45, line(s) 10-11 "A control 340 allows a 2D cut or slice 310 of the 3D image volume 320 to be selected"); and display, on the screen, a portion of interest corresponding to the two-dimensional image of interest among the plurality of portions in a state in which the portion of interest is visually specifiable (FIG. 8-10; also, paragraph 47, line(s) 7-9 "the 2D image slice 810 can be viewed with respect to the 3D volume 820 as well as separate from the 3D volume 820").
Regarding claim 2, Coustaud discloses the image processing apparatus according to claim 1, wherein the state in which the plurality of two-dimensional images and the three-dimensional image are comparable with each other (FIG. 3, 4; also, paragraph 45, line(s) 15-17 "FIGS. 3-4, the 2D image slice 310 and the 3D image volume 320 can be of the same") is a state in which a first region including the plurality of two-dimensional images and a second region including the three-dimensional image are arranged (FIG. 3-4; also, paragraph 45, line(s) 4-5 "a 2D image 310 is displayed with respect to a 3D volume 320"; also, paragraph 45, line(s) 13 "3D image volume 320, 2D image slice 310").
Regarding claim 3, Coustaud discloses the image processing apparatus according to claim 1, wherein the state in which the portion of interest is visually specifiable includes a state in which the portion of interest is distinguishable from remaining portions among the plurality of portions (FIG. 8-10; also, paragraph 47, line(s) 7-9 "the 2D image slice 810 can be viewed with respect to the 3D volume 820 as well as separate from the 3D volume 820").
Regarding claim 4, Coustaud discloses the image processing apparatus according to claim 1, wherein the state in which the portion of interest is visually specifiable includes a state in which the two-dimensional image of interest is distinguishable from remaining two-dimensional images among the plurality of two-dimensional images (FIG. 8-10; also, paragraph 47, line(s) 7-9 "FIG. 8, the 2D image slice 810 can be viewed with respect to the 3D volume 820 as well as separate from the 3D volume 820").
Regarding claim 5, Coustaud discloses the image processing apparatus according to claim 1, wherein the processor is configured to: display, on the screen (paragraph 13, line(s) 4-6 "processor to at least: generate a virtual environment for display of image content"), a plurality of position specifying images in which a plurality of imaging positions at which imaging for obtaining the plurality of two-dimensional images is performed are specifiable (FIG. 8-10; also, paragraph 47, line(s) 3-5 "ability to zoom into certain areas of an exam as well as an ability to rotate the image for a different viewing angle"; also, paragraph 47, line(s) 7-9 "FIG. 8, the 2D image slice 810 can be viewed with respect to the 3D volume 820 as well as separate from the 3D volume 820"; also, paragraph 47, line(s) 9-20 "FIG. 9 shows the virtual environment 200 rotated as the user views the 3D volume 820 and 2D slice 810 from a different angle. The angle of viewing can be adjusted by manipulating the avatar 220, selecting a menu option 230 (not shown), moving a scroll wheel and/or touch pad, clicking a button, etc. FIG. 10 shows a horizontal 2D image slice 1010, rather than a vertical image slice 810, being selected from the 3D volume 1020 and being manipulated by the avatar hands 220 to separate the 2D image 1010 from the 3D image volume 1020, for example. The user can pull the slice 1010 towards him/her, push the image slice 1010 away, etc."; also, paragraph 3, line(s) 16-18 "ability to zoom into certain areas of an exam as well as an ability to rotate the image for a different viewing angle"; also, paragraph 19, line(s) 8-10 "The images can include 2D images, such as an X-ray or CT scan, that can now be rotated and magnified in the VR environment."), in a state in which the plurality of position specifying images and the three-dimensional image are comparable with each other (FIG. 3, 4; also, paragraph 45, line(s) 15-17 "FIGS. 3-4, the 2D image slice 310 and the 3D image volume 320 can be of the same"); select an imaging position corresponding to a position specifying image of interest, which is selected from among the plurality of position specifying images (paragraph 47, line(s) 9-20 "FIG. 9 shows the virtual environment 200 rotated as the user views the 3D volume 820 and 2D slice 810 from a different angle. The angle of viewing can be adjusted by manipulating the avatar 220, selecting a menu option 230 (not shown), moving a scroll wheel and/or touch pad, clicking a button, etc. FIG. 10 shows a horizontal 2D image slice 1010, rather than a vertical image slice 810, being selected from the 3D volume 1020 and being manipulated by the avatar hands 220 to separate the 2D image 1010 from the 3D image volume 1020, for example. The user can pull the slice 1010 towards him/her, push the image slice 1010 away, etc."; also, paragraph 3, line(s) 16-18 "ability to zoom into certain areas of an exam as well as an ability to rotate the image for a different viewing angle"), as an imaging position of interest from among the plurality of imaging positions in response to the selection instruction; and select a two-dimensional image obtained by performing the imaging from the imaging position of interest as the two-dimensional image of interest from among the plurality of two-dimensional images (paragraph 42, line(s) 7-10 "user can select a slice from within a displayed 3D volume to focus operations, 2D slice views, combination between the 2D slice and the 3D volume, etc."; also, paragraph 45, line(s) 10-11 "A control 340 allows a 2D cut or slice 310 of the 3D image volume 320 to be selected").
Regarding claim 6, Coustaud discloses the image processing apparatus according to claim 5, wherein the state in which the plurality of position specifying images and the three-dimensional image are comparable with each other includes a state in which the plurality of position specifying images and the three-dimensional image face each other (FIG. 8-10; also, paragraph 47, line(s) 7-9 "FIG. 8, the 2D image slice 810 can be viewed with respect to the 3D volume 820 as well as separate from the 3D volume 820"; also, paragraph 47, line(s) 9-20 "FIG. 9 shows the virtual environment 200 rotated as the user views the 3D volume 820 and 2D slice 810 from a different angle. The angle of viewing can be adjusted by manipulating the avatar 220, selecting a menu option 230 (not shown), moving a scroll wheel and/or touch pad, clicking a button, etc. FIG. 10 shows a horizontal 2D image slice 1010, rather than a vertical image slice 810, being selected from the 3D volume 1020 and being manipulated by the avatar hands 220 to separate the 2D image 1010 from the 3D image volume 1020, for example. The user can pull the slice 1010 towards him/her, push the image slice 1010 away, etc.").
Regarding claim 7, Coustaud discloses the image processing apparatus according to claim 5, wherein the state in which the plurality of two-dimensional images and the three-dimensional image are comparable with each other (FIG. 3, 4; also, paragraph 45, line(s) 15-17 "FIGS. 3-4, the 2D image slice 310 and the 3D image volume 320 can be of the same") is a state in which a third region including the plurality of two-dimensional images and a fourth region (paragraph 79, line(s) 1-2 "two or more representations are registered in the virtual environment 200") including an image showing an aspect in which the plurality of position specifying images and the three-dimensional image face each other are arranged (FIG. 8-10; also, paragraph 47, line(s) 7-9 "FIG. 8, the 2D image slice 810 can be viewed with respect to the 3D volume 820 as well as separate from the 3D volume 820"; also, paragraph 47, line(s) 9-20 "FIG. 9 shows the virtual environment 200 rotated as the user views the 3D volume 820 and 2D slice 810 from a different angle. The angle of viewing can be adjusted by manipulating the avatar 220, selecting a menu option 230 (not shown), moving a scroll wheel and/or touch pad, clicking a button, etc. FIG. 10 shows a horizontal 2D image slice 1010, rather than a vertical image slice 810, being selected from the 3D volume 1020 and being manipulated by the avatar hands 220 to separate the 2D image 1010 from the 3D image volume 1020, for example. The user can pull the slice 1010 towards him/her, push the image slice 1010 away, etc.").
Regarding claim 8, Coustaud discloses the image processing apparatus according to claim 5, wherein the state in which the portion of interest is visually specifiable includes a state in which the position specifying image of interest is distinguishable from remaining position specifying images among the plurality of position specifying images (FIG. 8-10; also, paragraph 47, line(s) 7-9 "the 2D image slice 810 can be viewed with respect to the 3D volume 820 as well as separate from the 3D volume 820").
Regarding claim 9, Coustaud discloses the image processing apparatus according to claim 5, wherein the image processing apparatus has a first operation mode in which the plurality of two-dimensional images and the three-dimensional image are displayed on the screen (paragraph 46, line(s) 1-9 "FIG. 5, the 2D slice 310 can be extracted from the 3D volume 320 in the virtual environment 200 to be viewed separately in the virtual environment 200 by the user. The example of FIG. 6 shows that, using the avatar hand 220, a user can maneuver a composite image formed from a PET image 610 overlaid on a CT image 620 in the virtual environment 200. As shown in the example of FIG. 7, tumors 710, 715 can be highlighted in a cut image 720 in the virtual environment 200") in the state in which the plurality of two-dimensional images and the three-dimensional image are comparable with each other (FIG. 3, 4; also, paragraph 45, line(s) 15-17 "FIGS. 3-4, the 2D image slice 310 and the 3D image volume 320 can be of the same"), and a second operation mode in which the plurality of position specifying images are displayed on the screen (paragraph 46, line(s) 1-9 "FIG. 5, the 2D slice 310 can be extracted from the 3D volume 320 in the virtual environment 200 to be viewed separately in the virtual environment 200 by the user. The example of FIG. 6 shows that, using the avatar hand 220, a user can maneuver a composite image formed from a PET image 610 overlaid on a CT image 620 in the virtual environment 200. As shown in the example of FIG. 7, tumors 710, 715 can be highlighted in a cut image 720 in the virtual environment 200") in the state in which the plurality of position specifying images and the three-dimensional image are comparable with each other (FIG. 3, 4; also, paragraph 45, line(s) 15-17 "FIGS. 3-4, the 2D image slice 310 and the 3D image volume 320 can be of the same"), and the processor is configured to set any operation mode of the first operation mode or the second operation mode in response to a given setting instruction (paragraph 69, line(s) 10-16 "using the virtual hand 220, the user can move the image content closer, farther, angled, rotated, pull a 2D slice out of a 3D volume, put a 2D slice in a 3D volume, apply a tool and/or other menu 230 option to the image content, position a medical implant and/or instrument with respect to an anatomy in the image content, switch to a different slice view, etc., in the virtual environment 200").
Regarding claim 10, Coustaud discloses the image processing apparatus according to claim 5, wherein the three-dimensional image is displayed on the screen from a viewpoint corresponding to the two-dimensional image of interest (paragraph 47, line(s) 9-11 "FIG. 9 shows the virtual environment 200 rotated as the user views the 3D volume 820 and 2D slice 810 from a different angle.").
Regarding claim 11, Coustaud discloses an image processing apparatus comprising: a processor (paragraph 2, line(s) 1-2 "medical image processing"; also, FIG. 1; also, paragraph 4, line(s) 2 "apparatus includes at least one processor"), wherein the processor is configured to: display, on a screen (paragraph 13, line(s) 4-6 "processor to at least: generate a virtual environment for display of image content"), a plurality of two-dimensional images that are used to generate a three-dimensional image showing a target object in a real space and that are associated with a plurality of portions of the three-dimensional image (paragraph 5, line(s) 1-3 "two-dimensional image slice overlaid on a three-dimensional image"; also, FIG. 2; also, paragraph 19, line(s) 18-21 "display a virtual representation of the organ and the tumor so that the user can better visualize the exact location of the tumor in a 3D environment"; also, paragraph 19, line(s) 22-25 "Certain examples also provide a method and mechanism to scan and/or to take a previously scanned 3D object, such as a tool, implant, etc., and display it in the VR environment"; also, paragraph 38, line(s) 6-9 "user can access to the full 3D/4D content to identify a region of interest and/or an object in the image(s) 205, 210, segment the image(s) 205, 210, manipulate the image(s) 205, 210, etc."), and the three-dimensional image in a state in which the plurality of two-dimensional images and the three-dimensional image are comparable with each other (FIG. 3, 4; also, paragraph 45, line(s) 15-17 "FIGS. 3-4, the 2D image slice 310 and the 3D image volume 320 can be of the same"); select a portion of interest from among the plurality of portions in response to a given selection instruction (paragraph 42, line(s) 7-10 "user can select a slice from within a displayed 3D volume to focus operations, 2D slice views, combination between the 2D slice and the 3D volume, etc."; also, paragraph 45, line(s) 10-11 "A control 340 allows a 2D cut or slice 310 of the 3D image volume 320 to be selected"); select a two-dimensional image of interest corresponding to the portion of interest from among the plurality of two-dimensional images; and display, on the screen, the plurality of two-dimensional images in a state in which the two-dimensional image of interest is distinguishable from remaining two-dimensional images among the plurality of two-dimensional images (FIG. 6-7; also, paragraph 44, line(s) 2-10 "display a part of a segmented anatomy in the virtual environment 200. The segmented MRI information can be combined with CT scanner data. Different scans, different planes, etc., can be adjusted in the virtual environment 200. The user can interact with each individual representation (e.g., anatomic models, MIP views, 2D slice, MR image, CT image, etc.) to adjust thickness, opacity, position, overlay, measurement, annotation, etc., in the virtual environment 200.").
Regarding claim 12, Coustaud discloses the image processing apparatus according to claim 11, wherein the processor is configured to: display, on the screen (paragraph 13, line(s) 4-6 "processor to at least: generate a virtual environment for display of image content"), a plurality of position specifying images in which a plurality of imaging positions at which imaging for obtaining the plurality of two-dimensional images is performed are specifiable (FIG. 8-10; also, paragraph 47, line(s) 7-9 "FIG. 8, the 2D image slice 810 can be viewed with respect to the 3D volume 820 as well as separate from the 3D volume 820"), in a state in which the plurality of position specifying images and the three-dimensional image are comparable with each other (FIG. 3, 4; also, paragraph 45, line(s) 15-17 "FIGS. 3-4, the 2D image slice 310 and the 3D image volume 320 can be of the same"); select a position specifying image of interest from among the plurality of position specifying images in response to the selection instruction (paragraph 42, line(s) 7-10 "user can select a slice from within a displayed 3D volume to focus operations, 2D slice views, combination between the 2D slice and the 3D volume, etc."; also, paragraph 45, line(s) 10-11 "A control 340 allows a 2D cut or slice 310 of the 3D image volume 320 to be selected"); and select the two-dimensional image obtained by performing the imaging from the imaging position specified from the position specifying image of interest as the two-dimensional image of interest from among the plurality of two-dimensional images (paragraph 45, line(s) 11-17 "FIG. 4 shows another view of the virtual environment 20 including the 3D image volume 320, 2D image slice 310, and slice selection control 340 to select the slice 310 for viewing and interaction. As shown in the example of FIGS. 3-4, the 2D image slice 310 and the 3D image volume 320 can be of the same or of different image types").
Regarding claim 13, Coustaud discloses an image processing method comprising: displaying, on a screen (paragraph 13, line(s) 4-6 "processor to at least: generate a virtual environment for display of image content"), a plurality of two-dimensional images that are used to generate a three-dimensional image showing a target object in a real space and that are associated with a plurality of portions of the three-dimensional image (paragraph 5, line(s) 1-3 "two-dimensional image slice overlaid on a three-dimensional image"; also, FIG. 2; also, paragraph 19, line(s) 18-21 "display a virtual representation of the organ and the tumor so that the user can better visualize the exact location of the tumor in a 3D environment"; also, paragraph 19, line(s) 22-25 "Certain examples also provide a method and mechanism to scan and/or to take a previously scanned 3D object, such as a tool, implant, etc., and display it in the VR environment"; also, paragraph 38, line(s) 6-9 "user can access to the full 3D/4D content to identify a region of interest and/or an object in the image(s) 205, 210, segment the image(s) 205, 210, manipulate the image(s) 205, 210, etc."), and the three-dimensional image in a state in which the plurality of two-dimensional images and the three-dimensional image are comparable with each other (FIG. 3, 4; also, paragraph 45, line(s) 15-17 "FIGS. 3-4, the 2D image slice 310 and the 3D image volume 320 can be of the same"); selecting a two-dimensional image of interest from among the plurality of two-dimensional images in response to a given selection instruction (paragraph 42, line(s) 7-10 "user can select a slice from within a displayed 3D volume to focus operations, 2D slice views, combination between the 2D slice and the 3D volume, etc."; also, paragraph 45, line(s) 10-11 "A control 340 allows a 2D cut or slice 310 of the 3D image volume 320 to be selected"); and displaying, on the screen, a portion of interest corresponding to the two-dimensional image of interest among the plurality of portions in a state in which the portion of interest is visually specifiable (FIG. 8-10; also, paragraph 47, line(s) 7-9 "the 2D image slice 810 can be viewed with respect to the 3D volume 820 as well as separate from the 3D volume 820").
Regarding claim 14, Coustaud discloses a non-transitory computer-readable storage medium storing a program for causing a computer to execute a process comprising: displaying, on a screen (paragraph 14, line(s) 1-4 "non-transitory computer readable storage medium includes instructions which, when executed, cause at least one processor to at least: generate a virtual environment for display of image content"), a plurality of two-dimensional images that are used to generate a three-dimensional image showing a target object in a real space and that are associated with a plurality of portions of the three-dimensional image (paragraph 5, line(s) 1-3 "two-dimensional image slice overlaid on a three-dimensional image"; also, FIG. 2; also, paragraph 19, line(s) 18-21 "display a virtual representation of the organ and the tumor so that the user can better visualize the exact location of the tumor in a 3D environment"; also, paragraph 19, line(s) 22-25 "Certain examples also provide a method and mechanism to scan and/or to take a previously scanned 3D object, such as a tool, implant, etc., and display it in the VR environment"; also, paragraph 38, line(s) 6-9 "user can access to the full 3D/4D content to identify a region of interest and/or an object in the image(s) 205, 210, segment the image(s) 205, 210, manipulate the image(s) 205, 210, etc."), and the three-dimensional image in a state in which the plurality of two-dimensional images and the three-dimensional image are comparable with each other (FIG. 3, 4; also, paragraph 45, line(s) 15-17 "FIGS. 3-4, the 2D image slice 310 and the 3D image volume 320 can be of the same"); selecting a two-dimensional image of interest from among the plurality of two-dimensional images in response to a given selection instruction (paragraph 42, line(s) 7-10 "user can select a slice from within a displayed 3D volume to focus operations, 2D slice views, combination between the 2D slice and the 3D volume, etc."; also, paragraph 45, line(s) 10-11 "A control 340 allows a 2D cut or slice 310 of the 3D image volume 320 to be selected"); and displaying, on the screen, a portion of interest corresponding to the two-dimensional image of interest among the plurality of portions in a state in which the portion of interest is visually specifiable (FIG. 8-10; also, paragraph 47, line(s) 7-9 "the 2D image slice 810 can be viewed with respect to the 3D volume 820 as well as separate from the 3D volume 820").
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JAI WEI TOMMY LI whose telephone number is (571) 272-1170. The examiner can normally be reached 6:00 AM-4:00 PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Xiao Wu can be reached at (571) 272-7761. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JAI W LI/Junior Examiner, Art Unit 2613
/XIAO M WU/Supervisory Patent Examiner, Art Unit 2613