DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statements (IDS) submitted on 10/30/2024 and 08/09/2024 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-4, 6-11 and 13-20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Homan et al. (US PAP 2022/0160322 A1).
With respect to claim 1, Homan et al. teaches an imaging system (100) comprising (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131):
[media_image1.png (greyscale, 425 x 523)]
a processor (10) coupled with the imaging system (100) (see Fig. 2; paragraph 0056); and a memory (see paragraphs 0135 and 0142) storing instructions thereon that, when executed by the processor (10), cause the processor (10) to: perform a preview scan process associated with scanning a patient anatomy, wherein performing the preview scan process comprises capturing one or more localization images comprising a portion of the patient anatomy (see Fig. 3, step (202); Figs. 11 and 12; paragraph 0059); and capture one or more multidimensional images comprising at least the portion of the patient anatomy based on target coordinates associated with the one or more localization images (see Figs. 2, 3, 11 and 12; paragraphs 0059, 0065 and 0087).
[media_image2.png (greyscale, 720 x 496)]
With respect to claim 2, Homan et al. teaches the imaging system of claim 1 (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131), wherein the instructions are further executable by the processor to: based on the target coordinates, generate movement data associated with positioning or orienting at least one of a radiation source (104), a detector (106), a rotor, and a gantry (110) of the imaging system in association with capturing the one or more multidimensional images (see Figs. 1-3, 11 and 12; paragraphs 0093-0095).
With respect to claim 3, Homan et al. teaches the imaging system of claim 2 (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131), wherein the instructions are further executable by the processor to: control the positioning or orienting of at least one of the radiation source, the detector, the rotor, and the gantry based on the movement data (see Figs. 1-3, 11 and 12; paragraphs 0093-0095).
With respect to claim 4, Homan et al. teaches the imaging system of claim 2 (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131), wherein the instructions are further executable by the processor to: display guidance information associated with the positioning or orienting of at least one of the radiation source, the detector, the rotor, and the gantry based on the movement data (see Figs. 1-3, 11 and 12; paragraphs 0093-0095).
With respect to claim 6, Homan et al. teaches the imaging system of claim 1 (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131), wherein the one or more localization images comprise: a first localization image of a first image type comprising the portion of the patient anatomy; and a second localization image of a second image type comprising the portion of the patient anatomy, wherein capturing the one or more multidimensional images is based on at least one of the first localization image and the second localization image (see paragraph 0069).
With respect to claim 7, Homan et al. teaches the imaging system of claim 1 (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131), wherein the instructions are further executable by the processor to: set the target coordinates in response to a user input associated with the imaging system (see paragraphs 0047 and 0129).
With respect to claim 8, Homan et al. teaches the imaging system of claim 7 (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131), wherein the user input comprises an input via at least one of: a display; a controller device; and an audio input device (see paragraphs 0047 and 0129).
With respect to claim 9, Homan et al. teaches the imaging system of claim 1 (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131), wherein the instructions are further executable by the processor to: generate a field of view representation of the patient anatomy comprising the one or more localization images; and update the field of view representation based on the target coordinates (see paragraph 0087).
With respect to claim 10, Homan et al. teaches the imaging system of claim 1 (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131), wherein the instructions are further executable by the processor to: capture one or more second localization images comprising the portion of the patient anatomy based on the target coordinates, wherein capturing the one or more multidimensional images is based on: the one or more second localization images; one or more second target coordinates associated with the one or more second localization images; or both (see paragraph 0069).
With respect to claim 11, Homan et al. teaches the imaging system of claim 1 (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131), wherein the instructions executable to capture the one or more multidimensional images are further executable by the processor to capture an image volume comprising at least the portion of the patient anatomy based on image data of the one or more multidimensional images (see paragraph 0118).
With respect to claim 13, Homan et al. teaches the imaging system of claim 1 (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131), wherein each of the one or more localization images is: an anterior-posterior image, a posterior-anterior image, a lateral image, an oblique image, an axial image, a coronal image, or a sagittal image (see paragraph 0087).
With respect to claim 14, Homan et al. teaches the imaging system of claim 1 (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131), wherein capturing the one or more localization images is based on: one or more positions of a radiation source of the imaging system; and pose information of a subject with respect to the imaging system (see paragraph 0087).
With respect to claim 15, Homan et al. teaches the imaging system of claim 1 (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131), wherein the instructions are further executable by the processor to: display orientation information associated with the patient anatomy and the one or more localization images (see Figs. 11 and 12; paragraphs 0130 and 0131).
With respect to claim 16, Homan et al. teaches the imaging system of claim 1 (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131), wherein: the one or more localization images comprise a first localization image and a second localization image; and the instructions are further executable by the processor to: pause setting of second target coordinates in association with the second localization image in response to detecting an active user input of setting first target coordinates in association with the first localization image (see Figs. 11 and 12; paragraphs 0129, 0130 and 0131); and enable the setting of the second target coordinates in response to detecting completion of the active user input (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131).
With respect to claim 17, Homan et al. teaches a system comprising (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131): a processor (10); and a memory storing instructions thereon that, when executed by the processor, cause the processor to: perform a preview scan process associated with scanning a patient anatomy, wherein performing the preview scan process comprises capturing one or more localization images comprising a portion of the patient anatomy (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131); and capture one or more multidimensional images comprising at least the portion of the patient anatomy based on target coordinates associated with the one or more localization images (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131).
With respect to claim 18, Homan et al. teaches the system of claim 17 (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131), wherein the instructions are further executable by the processor to: based on the target coordinates, generate movement data associated with positioning or orienting at least one of a radiation source (104), a detector (106), a rotor, and a gantry (110) of the system in association with capturing the one or more multidimensional images (see Figs. 1-3, 11 and 12; paragraphs 0093-0095).
With respect to claim 19, Homan et al. teaches the system of claim 18 (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131), wherein the instructions are further executable by the processor (10) to: control the positioning or orienting of at least one of the radiation source (104), the detector (106), the rotor, and the gantry (110) based on the movement data (see Figs. 1-3, 11 and 12; paragraphs 0093-0095); display guidance information associated with the positioning or orienting of at least one of the radiation source (104), the detector (106), the rotor, and the gantry (110) based on the movement data; or a combination thereof (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131).
With respect to claim 20, Homan et al. teaches a method comprising (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131): performing, by an imaging system, a preview scan process associated with scanning a patient anatomy, wherein performing the preview scan process comprises capturing one or more localization images comprising a portion of the patient anatomy; and capturing one or more multidimensional images comprising at least the portion of the patient anatomy based on target coordinates associated with the one or more localization images (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Homan et al. (US PAP 2022/0160322 A1) as applied to claim 2 above, and further in view of Howell et al. (“An interactive fluoroscopic method to accurately measure the post-implantation position of pedicle screws”).
With respect to claim 5, Homan et al. teaches the imaging system of claim 2 (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131) but fails to explicitly mention that generating the movement data is based on: a pixel size associated with the one or more localization images; and a focal plane associated with the preview scan process.
Howell et al. discloses a system/method for X-ray fluoroscopic imaging (see pages 1258-1267), which explicitly teaches that generating the movement data is based on: a pixel size associated with the one or more localization images; and a focal plane associated with the preview scan process (see page 1259), in order to provide the user with the capability to improve imaging accuracy without correction for image distortion.
Homan et al. and Howell et al. disclose related methods/apparatuses for scanning a patient's anatomy and capturing one or more localization images.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to incorporate the teachings of Howell et al., namely generating the movement data based on a pixel size associated with the one or more localization images and a focal plane associated with the preview scan process, into the apparatus of Homan et al., since such a modification would provide the user with the capability to improve imaging accuracy without correction for image distortion.
It would have been obvious to treat Homan et al. and Howell et al. as related art, whereby an improvement to one of the systems/methods would readily be apparent as an improvement to the other.
The Examiner’s conclusion that claim 5 would have been obvious is based on the fact that all the claimed elements were known in the prior art, that one skilled in the art could have combined the elements as claimed by known methods with no change in their respective functions, and that the combination teaches nothing more than predictable results to one of ordinary skill in the art. KSR Int’l Co. v. Teleflex Inc., 550 U.S. 398, 82 USPQ2d 1385 (2007); Sakraida v. AG Pro, Inc., 425 U.S. 273, 282, 189 USPQ 449, 453 (1976); Anderson’s-Black Rock, Inc. v. Pavement Salvage Co., 396 U.S. 57, 62-63, 163 USPQ 673, 675 (1969); Great Atlantic & P. Tea Co. v. Supermarket Equipment Corp., 340 U.S. 147, 152, 87 USPQ 303, 306 (1950).
Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Homan et al. (US PAP 2022/0160322 A1) as applied to claim 1 above, and further in view of Hansis et al. (US PAP 2017/0150935 A1).
With respect to claim 12, Homan et al. teaches the imaging system of claim 1 (see abstract; Figs. 1-3, 11 and 12; paragraphs 0047, 0056, 0059, 0075, 0069, 0087, 0093, 0118, 0129, 0130 and 0131) but fails to explicitly mention that the instructions are further executable by the processor to generate a long scan image comprising at least the portion of the patient anatomy based on image data of the one or more multidimensional images.
Hansis et al. discloses a system/method for X-ray fluoroscopic imaging (see abstract; Fig. 2B; paragraph 0062)
[media_image3.png (greyscale, 317 x 486)]
which explicitly teaches that the instructions are further executable by the processor to generate a long scan image comprising at least the portion of the patient anatomy based on image data of the one or more multidimensional images (see abstract; Fig. 2B; paragraph 0062), in order to provide the user with the capability to improve imaging accuracy through easy and reliable identification of the target vertebral level in an interventional setting (see paragraph 0015).
Homan et al. and Hansis et al. disclose related methods/apparatuses for scanning a patient's anatomy and capturing one or more localization images.
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains to incorporate the teachings of Hansis et al., namely generating a long scan image comprising at least the portion of the patient anatomy based on image data of the one or more multidimensional images, into the apparatus of Homan et al., since such a modification would provide the user with the capability to improve imaging accuracy through easy and reliable identification of the target vertebral level in an interventional setting.
It would have been obvious to treat Homan et al. and Hansis et al. as related art, whereby an improvement to one of the systems/methods would readily be apparent as an improvement to the other.
The Examiner’s conclusion that claim 12 would have been obvious is based on the fact that all the claimed elements were known in the prior art, that one skilled in the art could have combined the elements as claimed by known methods with no change in their respective functions, and that the combination teaches nothing more than predictable results to one of ordinary skill in the art. KSR Int’l Co. v. Teleflex Inc., 550 U.S. 398, 82 USPQ2d 1385 (2007); Sakraida v. AG Pro, Inc., 425 U.S. 273, 282, 189 USPQ 449, 453 (1976); Anderson’s-Black Rock, Inc. v. Pavement Salvage Co., 396 U.S. 57, 62-63, 163 USPQ 673, 675 (1969); Great Atlantic & P. Tea Co. v. Supermarket Equipment Corp., 340 U.S. 147, 152, 87 USPQ 303, 306 (1950).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to IRAKLI KIKNADZE whose telephone number is (571)272-6494. The examiner can normally be reached 9:00 AM - 6:00 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, David J. Makiya can be reached at 571-272-2273. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
Irakli Kiknadze
/IRAKLI KIKNADZE/
Primary Examiner, Art Unit 2884
/I.K./ February 19, 2026