Prosecution Insights
Last updated: April 19, 2026
Application No. 18/680,367

MEDICAL IMAGING METHOD, APPARATUS, AND SYSTEM

Non-Final OA: §101, §102, §103, §112
Filed: May 31, 2024
Examiner: GROSS, JASON PATRICK
Art Unit: 3797
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: GE Precision Healthcare LLC
OA Round: 1 (Non-Final)

Grant Probability: 64% (Moderate)
Expected OA Rounds: 1-2
Time to Grant: 2y 8m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 64% (grants 9 of 14 resolved cases; -5.7% vs TC avg)
Interview Lift: +62.5% among resolved cases with interview (strong)
Avg Prosecution: 2y 8m typical timeline; 34 applications currently pending
Total Applications: 48 across all art units (career history)

Statute-Specific Performance

§101: 22.2% (-17.8% vs TC avg)
§103: 35.9% (-4.1% vs TC avg)
§102: 12.0% (-28.0% vs TC avg)
§112: 26.1% (-13.9% vs TC avg)

Tech Center averages are estimates. Based on career data from 14 resolved cases.

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Objections

Claims 8 and 19 are objected to because of the following informalities: to be consistent, “the movement path” recited in claims 8 and 19 should read “the planned movement path.” Appropriate correction is required.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked. As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C.
112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;

(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and

(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.

Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. These claim limitations are:

- a determination unit which determines a position of a medical accessory in a reference coordinate system, as recited in claim 16;
- a reconstruction unit which reconstructs a medical image according to a scanning result, as recited in claim 16;
- a display unit which displays in real time the position of the medical accessory in the medical image, as recited in claim 16; and
- a generation unit which generates a grid matrix according to the scanning result, as recited in claim 17.

Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. The corresponding structure to each of the determination unit, reconstruction unit, display unit, and generation unit is one or more processors (see, e.g., [0009], [0049], and [0132]).

If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C.
112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claim 6 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Claim 6 recites that “the two-dimensional image corresponding to the position of the medical accessory includes a first two-dimensional image that passes through the position of the medical accessory, and/or a second two-dimensional image that passes through a target point of the medical accessory, and that is perpendicular to the first two-dimensional image passing through the position of the medical accessory.” Claim 6 thus recites that the 2D image includes a first 2D image and/or a second 2D image. However, the second 2D image is recited as being “perpendicular” to the first 2D image. Based on these limitations, the second 2D image requires the first 2D image to exist. Hence, the “and/or” in claim 6 is inconsistent and causes ambiguity in the claim.
For the purposes of compact prosecution, Examiner is interpreting claim 6 as follows: …wherein the two-dimensional image corresponding to the position of the medical accessory includes a first two-dimensional image that passes through the position of the medical accessory, a second two-dimensional image that passes through a target point of the medical accessory, or the first two-dimensional image and the second two-dimensional image in which the second two-dimensional image is perpendicular to the first two-dimensional image passing through the position of the medical accessory. Appropriate correction is required.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
The claims recite:

[a] determining a position of a medical accessory in a reference coordinate system based on a camera image acquired by an auxiliary camera, as recited in claim 1;
[b] reconstructing a medical image according to a scanning result and based on the reference coordinate system, as recited in claim 1;
[c] generating a grid matrix according to the scanning result, wherein a starting point and/or a target point of a planned movement path of the medical accessory are/is marked in the grid matrix, as recited in claim 3;
[d] displaying on the grid matrix in real time the position of the medical accessory and/or information indicating a relative positional relationship between the medical accessory and the starting point and/or the target point, as recited in claim 4;
[e] determining the position of the medical accessory in a camera coordinate system according to the camera image, as recited in claim 9;
[f] determining the position of the medical accessory in the reference coordinate system according to the position of the medical accessory in the camera coordinate system and a first transformation matrix between the camera coordinate system and the reference coordinate system, as recited in claim 9;
[g] recognizing the marker in the camera image, and determining an extension direction and position information of the marker in the camera coordinate system, as recited in claim 10;
[h] determining the position of the puncture needle in the camera coordinate system according to the extension direction, the position information, and a relative positional relationship between the marker and the puncture needle, as recited in claim 10;
[i] reconstructing the medical image in the reference coordinate system according to the position of the scanning result in a scanning coordinate system and a second transformation matrix between the scanning coordinate system and the reference coordinate system, as recited in claim 14;
[j] reconstructing the medical image according to a relative positional relationship between the first position and the second position, as recited in claim 15;
[k] a determination unit which determines a position of a medical accessory in a reference coordinate system based on a camera image acquired by an auxiliary camera, as recited in claim 16;
[l] a reconstruction unit which reconstructs a medical image according to a scanning result and based on the reference coordinate system, as recited in claim 16; and
[m] a generation unit which generates a grid matrix according to the scanning result, wherein a starting point and/or a target point of a planned movement path of the medical accessory is marked in the grid matrix, as recited in claim 17.

Each of claim limitations [a], [f]-[h], and [k], as drafted and under its broadest reasonable interpretation, recites a mathematical concept and/or mental process. (MPEP 2106.04(a)(2)(I); see, e.g., Digitech Image Techs., LLC v. Electronics for Imaging, Inc., 758 F.3d 1344, 1350, 111 USPQ2d 1717, 1721 (Fed. Cir. 2014) (although the claims did not recite a particular mathematical formula, the court held “[w]ithout additional limitations, a process that employs mathematical algorithms to manipulate existing information to generate additional information is not patent eligible.”)). These claim limitations are mental processes because determining a relative position of a medical accessory (e.g., a surgical tool) in a coordinate system (i.e., space) based on a camera image is an observation, evaluation, judgment, or opinion that surgeons have used since incorporating videos into surgical procedures.
The claim limitations are also mathematical concepts because, in the context of surgical navigation systems that register different coordinate systems, they require identifying markers and their respective locations within images and then using various mathematical calculations, such as those used for triangulation, rigid transformations, and best-fit alignment (e.g., using least squares), to determine the position of the medical accessory.

Each of claim limitations [b], [i], and [l], as drafted and under its broadest reasonable interpretation, recites a mathematical concept. (MPEP 2106.04(a)(2)(I); see, e.g., Digitech, supra). These claim limitations use various mathematical concepts and calculations, such as converting physical coordinates to pixel or voxel coordinates, resampling/interpolating (e.g., using nearest neighbor, trilinear, or spline interpolation) depending on where the tool is located, and rendering the images, to generate images that accurately predict the location of the surgical tool within human anatomy.

Claim limitation [j], as drafted and under its broadest reasonable interpretation, recites a mathematical concept. (MPEP 2106.04(a)(2)(I); see, e.g., Digitech, supra).
The claim limitation uses various mathematical concepts, such as converting physical coordinates to pixel or voxel coordinates, resampling/interpolating (e.g., using nearest neighbor, trilinear, or spline interpolation) depending on where the tool is located, and rendering the images, to account for movement of the patient/table and generate images that accurately predict the location of the surgical tool within human anatomy.

Each of claim limitations [c], [d], and [m], as drafted and under its broadest reasonable interpretation, recites a mathematical concept. (MPEP 2106.04(a)(2)(I); see, e.g., Digitech, supra). These claim limitations use various mathematical concepts and calculations, such as converting physical coordinates to pixel or voxel coordinates, resampling/interpolating (e.g., using nearest neighbor, trilinear, or spline interpolation) depending on where a planned path is located, and rendering the images, to generate images that superimpose a grid matrix over human anatomy in which the grid matrix optionally includes a starting point, target point, or other indications of the planned path.

The next question is to consider whether the claims integrate the judicial exception into a practical application. A claim that integrates a judicial exception into a practical application will apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that the claim is more than a drafting effort designed to monopolize the judicial exception. (MPEP 2106.04(d)).
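The “best-fit alignment (e.g., using least squares)” that the rejection characterizes as a mathematical concept can be illustrated concretely. Below is a minimal NumPy sketch of a Kabsch-style least-squares rigid registration between corresponding point sets (e.g., marker positions observed in two coordinate frames); the function name and data are illustrative assumptions, not drawn from the application or the cited art:

```python
import numpy as np

def best_fit_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) such that dst ~= R @ src + t.

    src, dst: (N, 3) arrays of corresponding 3D points.
    Kabsch algorithm: center, cross-covariance, SVD.
    """
    src_c = src - src.mean(axis=0)          # center both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                     # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

With at least three non-collinear correspondences, this recovers the rotation and translation exactly in the noise-free case and minimizes squared error otherwise.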
In this case, some additional elements/steps to consider include that: (1) the reference coordinate system is a world coordinate system having an origin of the world coordinate system as being in the center of the scanning gantry (claim 2); (2) superimposing a grid matrix (claim 3); (3) displaying three-dimensional or two-dimensional images at particular orientations with the medical accessory and other positional information (claims 5, 6, 7, and 8); (4) the medical accessory includes a puncture needle and a marker (claim 10); (5) the marker has a preset position or angle relative to the needle (claim 11); (6) the marker is a cannula sleeved on a peripheral side of the puncture needle (claim 12); and (7) the marker has a component having a preset temperature, a component having a preset shape and/or pattern, or a component emitting light according to a preset rule (claim 13).

Here, the judicial exception is not integrated into a practical application. Additional element/step (1) is a well-understood, routine, conventional activity/element that is known to the industry, as different coordinate systems are often registered to one another for surgical navigation, including using the isocenter or origin of the medical imaging system. (MPEP 2106.05(d)) (see, e.g., GREGERSON discussed below and other prior art made of record but not discussed in the rejections).

With respect to additional element/step (2), superimposing graphical objects over camera or medical images is a well-understood, routine, conventional activity/element that is known to surgical navigation. (MPEP 2106.05(d)) (see, e.g., JING LI and GREGERSON discussed below). While the grid matrix is a more particular graphical object, it does not impose meaningful limits on the claim.
With respect to additional element/step (3), displaying three-dimensional or two-dimensional images at particular orientations with the medical accessory and other positional information is a well-understood, routine, conventional activity/element that is known to surgical navigation (MPEP 2106.05(d)) (see, e.g., GREGERSON), and is also an insignificant post-solution activity. (MPEP 2106.04(d)(I), which also refers to MPEP 2106.05(g)).

With respect to additional elements/steps (4), (5), and (7), the medical accessory including a puncture needle and a marker having a known pattern and position/orientation with respect to the accessory is a well-understood, routine, conventional activity/element that is known to surgical navigation. (MPEP 2106.05(d)) (see, e.g., JING LI and STANTON). With respect to additional element/step (6), the marker being attached to a cannula sleeved on a peripheral side of the puncture needle is a well-understood, routine, conventional activity/element that is known to surgical navigation. (MPEP 2106.05(d)) (see, e.g., STANTON).

The claims do not include additional elements/steps that are sufficient to amount to significantly more than the judicial exception. A shared quality of the additional elements and/or steps is that they do not recite any meaningful limitation that transforms the judicial exception into a patent-eligible application. (MPEP 2106.05(II)). As explained above, the additional elements/steps either are well-understood, routine, conventional activity that is known to the industry and/or recite insignificant extra-solution activity that does not impose meaningful limits. Accordingly, the claims do not recite patent-eligible subject matter.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C.
102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1, 9-11, 13, 14, and 16 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Li, Jing, et al., “A fully automatic surgical registration method for percutaneous abdominal puncture surgical navigation,” Computers in Biology and Medicine 136 (2021) (hereinafter “JING LI”).

With respect to claim 1, JING LI teaches a medical imaging method. JING LI teaches a “marker-less surgical registration method” that uses NIR binocular tracking and CT imaging (Abstract and Figure 1). The method comprises:

determining a position of a medical accessory in a reference coordinate system based on a camera image acquired by an auxiliary camera. The medical accessory (“surgical ablation needle” shown in Figure 1) has markers attached thereto and is tracked by the NIR vision tracking system consisting of “two sets of CMOS industrial cameras, near infrared-filters and near-infrared light sources….” (p.2, top of right column). The reference coordinate system is the “world coordinate system” shown in Figure 1, and multiple other “coordinate systems” are aligned with the world coordinate system. (p.2, right column, third full paragraph).

reconstructing a medical image according to a scanning result and based on the reference coordinate system. “Computed Tomography coordinate system (CSYSCT) is used to describe the image space. And the transformation relationship between CSYSCT and CSYSWorld is the key to the surgical navigation system, which is obtained through the surgical registration….” (p.2, right column, third full paragraph).
This method step occurs whenever the medical image is constructed showing the needle in real time.

displaying in real time the position of the medical accessory in the medical image. “As shown in Fig. 16, the surgeon adjusts the ablation needle in real time according to the visualization interface of the navigation software to make the virtual ablation needle and its tip in the visualization interface consistent with the planned puncturing path and target.” (p.9, right column, middle paragraph; see also Figure 16).

With respect to claim 9, JING LI teaches wherein determining the position of the medical accessory in the reference coordinate system based on the camera image acquired by the auxiliary camera includes determining the position of the medical accessory in a camera coordinate system according to the camera image (“The four reflective balls fixed on the fixture of surgical instrument are identified by the binocular vision tracking system as the dynamic marking points which are used to determine the spatial pose of surgical instrument according to transformation characteristics of rigid body.” (p.2, right column, first full paragraph)) and determining the position of the medical accessory in the reference coordinate system according to the position of the medical accessory in the camera coordinate system and a first transformation matrix between the camera coordinate system and the reference coordinate system (“[T]he coordinate systems in optical surgical navigation system include surgical space coordinate system (CSYSWorld), binocular vision tracking system coordinate system (CSYSBV), structured light vision system coordinate system (CSYSSL) and surgical instrument coordinate system (CSYSNeedle), which are unified by calibration using a specially-designed chessboard-circle calibration board integrating chessboard calibration board with asymmetric circular calibration board.”; see also Figure 2, in which the camera coordinate system “CSYSBV” has a transformation matrix (“TBV->World”) for aligning to the reference coordinate system (i.e., “CSYSWorld”)).

With respect to claim 10 (depending from claim 9), JING LI teaches wherein the medical accessory comprises a puncture needle and a marker, and determining the position of the medical accessory in the camera coordinate system according to the camera image includes recognizing the marker in the camera image, and determining an extension direction and position information of the marker in the camera coordinate system; and determining the position of the puncture needle in the camera coordinate system according to the extension direction, the position information, and a relative positional relationship between the marker and the puncture needle. (“The four reflective balls fixed on the fixture of surgical instrument are identified by the binocular vision tracking system as the dynamic marking points which are used to determine the spatial pose of surgical instrument according to transformation characteristics of rigid body.” (p.2, right column, first full paragraph); see also Figure 3).

With respect to claim 11 (depending from claim 10), JING LI discloses that the extension direction of the marker forms a preset included angle with the extension direction of the puncture needle, and/or the marker is fixed at a preset position of the puncture needle. While JING LI does not explicitly state the above, one having ordinary skill in the art would recognize that the reflective balls (“marker”) form a pattern that necessarily forms a preset angle with the extension direction of the puncture needle and that is necessarily fixed at a preset position with respect to the puncture needle. Otherwise, the position of the needle could not be determined. As such, JING LI teaches the limitations of claim 11.
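The claim 9-10 mapping above — a first transformation matrix from the camera (binocular vision) frame to the reference frame, plus a preset marker-to-needle relationship — can be sketched with homogeneous coordinates. A minimal illustrative example; the 4×4 matrix, offset, and function names are placeholders, not values from JING LI:

```python
import numpy as np

def to_world(T_bv_to_world, p_camera):
    """Map a 3D point from the camera frame to the world/reference frame
    using a 4x4 homogeneous transformation matrix."""
    p_h = np.append(p_camera, 1.0)          # homogeneous coordinates
    return (T_bv_to_world @ p_h)[:3]

def needle_tip_in_world(T_bv_to_world, marker_pos_cam, marker_dir_cam, tip_offset):
    """Claim-10 style step: the needle tip lies at a fixed (preset) offset
    from the marker along the marker's extension direction."""
    d = marker_dir_cam / np.linalg.norm(marker_dir_cam)
    tip_cam = marker_pos_cam + tip_offset * d   # needle tip in camera frame
    return to_world(T_bv_to_world, tip_cam)     # then into the reference frame
```

For example, with a pure-translation transform and a marker pointing along z, the tip position in the world frame is just the translated offset point.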
With respect to claim 13 (depending from claim 10), JING LI teaches that the marker includes at least a component having a preset shape and/or pattern. (See Figure 3).

With respect to claim 14, JING LI teaches that reconstructing the medical image according to the scanning result and based on the reference coordinate system includes reconstructing the medical image in the reference coordinate system according to the position of the scanning result in a scanning coordinate system and a second transformation matrix between the scanning coordinate system and the reference coordinate system. “Computed Tomography coordinate system (CSYSCT) is used to describe the image space. And the transformation relationship between CSYSCT and CSYSWorld is the key to the surgical navigation system, which is obtained through the surgical registration….” (p.2, right column, third full paragraph). NOTE: Examiner is interpreting “a second transformation matrix between the scanning coordinate system and the reference coordinate system” to include two transformation relationships as shown in Figure 2. More specifically, CSYSCT is first aligned with CSYSSL (structured light) and then aligned with CSYSWorld.

With respect to claim 16, JING LI teaches a medical imaging apparatus. JING LI teaches an optical surgical navigation system that includes NIR binocular tracking and CT imaging (Abstract and Figure 1). NOTE: The determination unit and the reconstruction unit are taught by the various functions described in JING LI and the “computer” having a “CPU: i7-7700HA 2.8 GHz” and “GPU: NVIDIA GEForce GTX 1060”. The apparatus comprises:

a determination unit which determines a position of a medical accessory in a reference coordinate system based on a camera image acquired by an auxiliary camera.
The medical accessory (“surgical ablation needle” shown in Figure 1) has markers attached thereto and is tracked by the NIR vision tracking system consisting of “two sets of CMOS industrial cameras, near infrared-filters and near-infrared light sources….” (p.2, top of right column). The reference coordinate system is the “world coordinate system” shown in Figure 1 and multiple other “coordinate systems” are aligned with the world coordinate system. (p.2, right column, third full paragraph).

a reconstruction unit which reconstructs a medical image according to a scanning result and based on the reference coordinate system. “Computed Tomography coordinate system (CSYSCT) is used to describe the image space. And the transformation relationship between CSYSCT and CSYSWorld is the key to the surgical navigation system, which is obtained through the surgical registration….” (p.2, right column, third full paragraph). This operation occurs whenever the medical image is constructed showing the needle in real time.

a display unit which displays in real time the position of the medical accessory in the medical image. “As shown in Fig. 16, the surgeon adjusts the ablation needle in real time according to the visualization interface of the navigation software to make the virtual ablation needle and its tip in the visualization interface consistent with the planned puncturing path and target.” (p.9, right column, middle paragraph; see also Figure 16).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 2, 5-8, 18, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Li, Jing, et al., “A fully automatic surgical registration method for percutaneous abdominal puncture surgical navigation,” Computers in Biology and Medicine 136 (2021) (hereinafter “JING LI”) as applied to claim 1 above, and further in view of U.S. Patent Appl. Publ. No. 2023/0355347 A1 (hereinafter “GREGERSON”).

With respect to claim 2, JING LI teaches the claim limitations of claim 1 as set forth above. JING LI also teaches that the reference coordinate system includes a world coordinate system (Figure 1). However, JING LI does not teach that an origin of the world coordinate system is located at a center of a scanning gantry.

In the same field of endeavor, GREGERSON teaches methods and systems for performing computer-assisted image-guided surgery. GREGERSON is particularly concerned with minimally-invasive surgeries (see, e.g., [0074]) and various tools can be guided, including a needle (see, e.g., [0049]). Like JING LI, the image data can be pre-operative CT images (e.g., [0032]) and the surgical tools are tracked by an optical sensing device using markers. “The motion tracking system 105 in the embodiment of FIG. 1 includes a plurality of marker devices 119, 202 and 315 and a stereoscopic optical sensor device 111 that includes two or more cameras (e.g., IR cameras).” ([0036]).
The system has a display device that displays image data of the patient’s anatomy. ([0040]). “The display device 121 may facilitate planning for a surgical procedure, such as by enabling a surgeon to define one or more target positions in the patient’s body and/or a path or trajectory into the patient’s body for inserting surgical tool(s) to reach a target position while minimizing damage to other tissue or organs of the patient.” ([0040]).

GREGERSON teaches that first and second image datasets may be registered to a common coordinate system. ([0045]). “This may include performing a rigid transformation to map each pixel or voxel of the first image dataset into corresponding 3D coordinates (i.e., x, y, z coordinates) of the common coordinate system. A number of techniques may be utilized for registering multiple image datasets. In one non-limiting example of a registration process for x-ray CT imaging data, a pre-scan calibration process may be used to precisely calculate (e.g., within 1 mm) the transformation between the isocenter of the x-ray gantry 40 and the optical sensing device 111. A set of markers 211 (e.g., 3 or more, such as 4-6 markers) may be provided on the surface of the gantry 40, as shown in FIG. 2. The markers 211 may be within the field of view of the optical sensing device 111 to enable the gantry 40 position to be tracked by the motion tracking system 105.” ([0045]).

It would have been obvious to one having ordinary skill in the art at the time of filing to modify the JING LI system to use the registration process of GREGERSON to map the image dataset to a common coordinate system using the isocenter of the CT gantry as its origin. JING LI already aligns the CT imaging data to a world coordinate system using a chessboard calibration device.
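Registering image data into a common coordinate system through intermediate calibrated frames — JING LI's CSYSCT → CSYSSL → CSYSWorld chain, or GREGERSON's gantry-isocenter-to-optical-sensor transform — reduces to composing homogeneous transformation matrices. A minimal sketch with illustrative matrices (not values from either reference):

```python
import numpy as np

def compose(T_b_to_c, T_a_to_b):
    """Chain two 4x4 homogeneous transforms: a point in frame A is mapped
    to frame C by applying A->B first, then B->C."""
    return T_b_to_c @ T_a_to_b

# Illustrative: CT -> structured light is a pure translation along x,
# structured light -> world is a 90-degree rotation about z.
T_ct_to_sl = np.eye(4)
T_ct_to_sl[:3, 3] = [10.0, 0.0, 0.0]
T_sl_to_world = np.eye(4)
T_sl_to_world[:3, :3] = [[0, -1, 0], [1, 0, 0], [0, 0, 1]]

T_ct_to_world = compose(T_sl_to_world, T_ct_to_sl)

p_ct = np.array([1.0, 0.0, 0.0, 1.0])   # voxel position in the CT frame (homogeneous)
p_world = T_ct_to_world @ p_ct           # same point in the world frame
```

Order matters: the composed matrix applies the CT-to-structured-light transform before the structured-light-to-world transform, mirroring the two-step registration described above.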
One of ordinary skill in the art would have been motivated to do this because, with the calibration device attached to the CT bed or the CT gantry structure (as shown in GREGERSON), the structured light transformation could be skipped, or the GREGERSON registration step could be used in addition to the structured light registration as a redundant registration method that could minimize the impact of movement caused by patient breathing. There would have been a reasonable expectation of success as coordinate systems are frequently registered with one another for surgical navigation. With respect to claim 5, JING LI teaches wherein the medical image comprises a three-dimensional image: “The post-processing and 3D reconstruction of CT images are required to extract the point cloud of abdominal area.” While JING LI concerns two-dimensional images and slices of the CT image data (see, e.g., Figures 2, 4, and 5; see also Figure 14 showing recommended puncturing paths in 2D perspectives), it is not clear that JING LI teaches wherein the method further includes displaying in real time, according to the position of the medical accessory, the two-dimensional image corresponding to the position of the medical accessory. In the same field of endeavor, GREGERSON teaches methods and systems for performing computer-assisted image-guided surgery. GREGERSON is particularly concerned with minimally-invasive surgeries (see, e.g., [0074]) and various tools can be guided, including a needle (see, e.g., [0049]). Like JING LI, the image data can be pre-operative CT images (e.g., [0032]) and the surgical tools are tracked using markers ([0035]). The system has a display device that displays image data of the patient’s anatomy. ([0040]).
“The display device 121 may facilitate planning for a surgical procedure, such as by enabling a surgeon to define one or more target positions in the patient’s body and/or a path or trajectory into the patient’s body for inserting surgical tool(s) to reach a target position while minimizing damage to other tissue or organs of the patient.” ([0040]). GREGERSON likewise teaches that the medical image comprises a three-dimensional image and/or a two-dimensional image displayed on one or a plurality of planes. “[A] first display mode may include displaying a 3D image dataset (e.g., an x-ray CT, MRI, sonogram, PET or SPECT image dataset) in multiple two-dimensional slices corresponding to anatomic planes (e.g., axial, sagittal, coronal planes) transecting the patient.” ([0060]). GREGERSON also teaches displaying in real time, according to the position of the medical accessory, the two-dimensional image corresponding to the position of the medical accessory. “In block 307 of method 300, images of the patient’s anatomy from the first image dataset may be displayed with an overlay of one or more features derived from the second image dataset in the common coordinate system…The images of the patient’s anatomy may include 2D slices of a three-dimensional image dataset (e.g., a tomographic reconstruction)….” ([0048]). “The one or more features derived from the second image dataset that may be displayed overlaying the images of the patient’s anatomy may include graphical depictions of a tool 104, an end effector 102 or another object that is tracked by the motion tracking system 105…The graphical depiction may be a rendering of the actual size and shape of the object or may be a depiction of select features of the object, such as a location of a tip end of the object and/or an orientation of the object.” ([0049]). Regarding the images being in “real time,” Examiner is interpreting the subsequent display updates described in GREGERSON as teaching real-time display.
“The motion tracking system 105 may repeatedly acquire new images from the optical sensing device 111, and the relative positions and/or orientations of objects within the field of view of the optical sensing device 111 may be updated with each acquisition of new images from the optical sensing device 111...The display device 121 may be updated to reflect any change(s) in the position and/or orientation of the objects within the common coordinate system (e.g., relative to the patient reference arc 115)….” ([0050]). It would have been obvious to one having ordinary skill in the art at the time of filing to modify the JING LI system to display in real time, according to the position of the medical accessory, the two-dimensional image corresponding to the position of the medical accessory as taught in GREGERSON. One of ordinary skill in the art would have been motivated to show the different scan planes relative to the tool and the target, as taught in GREGERSON, because the surgeon desires relevant feedback (i.e., where the needle and tip of the needle are located relative to the target) during the procedure. There would have been a reasonable expectation of success as GREGERSON teaches that the tool can be shown in different 2D images of the anatomy. With respect to claim 6 (depending from claim 5 and in light of Section 112 rejection), GREGERSON teaches that the two-dimensional image corresponding to the position of the medical accessory includes a first two-dimensional image that passes through the position of the medical accessory, and/or a second two-dimensional image that passes through a target point of the medical accessory, and that is perpendicular to the first two-dimensional image passing through the position of the medical accessory.
“The display screen 500 may also display graphical elements illustrating the relationship of each slice 501, 503, 505 relative to the other slices shown on the display screen 500…The display screen 500 may also include graphical representations or renderings of other objects or tools tracked by the motion tracking system 105. In the example of FIG. 5, a graphical representation of a tool 509 is shown in the lower right quadrant of the display screen 500…Similar graphical elements may be displayed in the 2D slice images 501, 503 and 505 to illustrate the position and/or orientation of one or more objects with respect to the patient.” ([0061]). In Figure 6B of GREGERSON, slice 611 teaches the “first two-dimensional image” and slice 615 teaches the “second two-dimensional image.” Slices 611 and 613 would show the length of the needle as the needle approached the target, and slice 615 intersects the target. ([0064]-[0066]). It would have been obvious to one having ordinary skill in the art at the time of filing to modify the JING LI system to display multiple 2D views that are perpendicular to one another so that the tool and the target are shown to the surgeon during the procedure, as taught in GREGERSON. One of ordinary skill in the art would have been motivated to show the different image planes relative to the tool and the target, as taught in GREGERSON, because the surgeon desires relevant feedback (i.e., where the needle and tip of the needle are located relative to the target) during the procedure. There would have been a reasonable expectation of success as GREGERSON teaches that the tool can be shown in different 2D images of the anatomy.
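The display behavior attributed to GREGERSON for claims 5 and 6 — re-selecting, as each new tracked position arrives, a first 2D slice passing through the tool position and a second, perpendicular slice passing through the target — can be sketched as follows. This is an illustration of the concept only; the volume dimensions, indices, and tracked positions are hypothetical, not data from either reference:

```python
import numpy as np

def slices_for_positions(volume, tool_idx, target_idx):
    """Return a first 2D slice passing through the tool position and a
    second, perpendicular 2D slice passing through the target point.
    volume is indexed (z, y, x); tool_idx and target_idx are voxel indices."""
    z_tool, y_tool, x_tool = tool_idx
    z_target, y_target, x_target = target_idx
    first = volume[:, :, x_tool]   # sagittal plane containing the tool
    second = volume[z_target]      # perpendicular axial plane through target
    return first, second

# Hypothetical CT volume: 40 axial slices of 64x64 voxels.
volume = np.arange(40 * 64 * 64, dtype=float).reshape(40, 64, 64)

# Each newly tracked tool position re-selects the displayed slices,
# mirroring the repeated-update behavior quoted from [0050].
for tool_idx in [(10, 30, 20), (12, 31, 20)]:
    first, second = slices_for_positions(volume, tool_idx,
                                         target_idx=(25, 30, 20))
    print(first.shape, second.shape)  # (40, 64) (64, 64)
```

A real system would resample oblique planes along the needle axis rather than take axis-aligned slices; axis-aligned indexing is used here only to keep the sketch short.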
With respect to claim 7 (depending from claim 6 and in light of Section 112 rejection), GREGERSON teaches that the second two-dimensional image comprises marking information of the target point, and wherein the marking information comprises information related to a relative positional relationship between the medical accessory and the target point. Slices 611, 613, and 615 would show a relative positional relationship between the needle and the target point as the needle approached the target point. “The display panel 500 may also enable the surgeon to visualize multiple trajectories or paths extending from the patient’s skin surface through the patient’s anatomy to the target position.” ([0070]) (see also [0054] explaining that “The system 400 may store the positions and/or orientations of user-defined trajectories or target locations within the common coordinate system, and may display graphical representations of such trajectories or target locations on the display(s) 121, 401.”). A distance “d” is also shown. It would have been obvious to one having ordinary skill in the art at the time of filing to modify the JING LI system to display a 2D image along with marking information that is related to a relative positional relationship between the medical accessory and the target point, as taught in GREGERSON. One of ordinary skill in the art would have been motivated to show such marking information, as taught in GREGERSON, because the surgeon desires relevant feedback (i.e., where the needle and tip of the needle are located relative to the target) during the procedure. There would have been a reasonable expectation of success as GREGERSON teaches that marking information can be shown in a 2D image.
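The marking information discussed for claim 7 — a distance “d” and the relative position of the needle tip versus the target — reduces to simple vector arithmetic once both points are expressed in the common coordinate system. The sketch below uses hypothetical coordinates and is not code from any cited reference:

```python
import numpy as np

def tip_to_target(tip, target):
    """Return the displacement vector from the tracked needle tip to the
    target point, and the straight-line distance 'd' between them (mm)."""
    displacement = np.asarray(target, dtype=float) - np.asarray(tip, dtype=float)
    return displacement, float(np.linalg.norm(displacement))

# Hypothetical tracked positions in a common coordinate system (mm).
vec, d = tip_to_target(tip=(0.0, 0.0, 0.0), target=(3.0, 4.0, 12.0))
print(vec, d)  # [ 3.  4. 12.] 13.0
```

The displacement vector also gives the direction the surgeon would need to advance the needle, which is the kind of relative-position feedback the rejection attributes to GREGERSON’s display.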
With respect to claim 8, GREGERSON teaches that the method further includes, in the three-dimensional image of the medical image, performing superimposed display of: a planned movement path of the medical accessory, and/or information indicating a relative positional relationship between the medical accessory and a starting point and/or a target point of the movement path, and/or a predicted path determined according to the position and an extension direction of the medical accessory, and/or the camera image. “The graphical representation of the tool 509 may illustrate the position and orientation of the tool relative to the anatomic features depicted in the 3D volume rendering 507.” ([0061]). “In embodiments, the display screen 500 may display graphical element(s) overlaying the image data corresponding to one or more target positions and/or trajectories that are set by the user.” ([0072]). Moreover, “[i]n some embodiments, the user may be able to make the superimposed image data (e.g., 3D volume rendering 734) more or less transparent relative to the camera images (e.g., real-time video images) shown on the display screen 500. A slider 735 or similar graphical interface element on the display screen 500 (e.g., a touchscreen display) may be used to adjust the relative transparency of the 3D volume rendering relative to the camera images, as shown in FIG. 7F.” ([0089]). It would have been obvious to one having ordinary skill in the art at the time of filing to modify the JING LI system to superimpose, in the 3D volume, a planned movement path, a relative positional relationship between the accessory and a target point, or a predicted path, as taught in GREGERSON. One of ordinary skill in the art would have been motivated to superimpose such information, as taught in GREGERSON, because the surgeon would like to visualize a plan for the procedure and/or a position of the tool during the procedure. 
There would have been a reasonable expectation of success as GREGERSON teaches that information may be displayed with the 3D volume. With respect to claim 18, GREGERSON teaches that the medical image includes a three-dimensional image and/or a two-dimensional image displayed on one or a plurality of planes and wherein the display unit displays in real time, according to the position of the medical accessory, the two-dimensional image corresponding to the position of the medical accessory. GREGERSON teaches that the medical image comprises a three-dimensional image and/or a two-dimensional image displayed on one or a plurality of planes. “[A] first display mode may include displaying a 3D image dataset (e.g., an x-ray CT, MRI, sonogram, PET or SPECT image dataset) in multiple two-dimensional slices corresponding to anatomic planes (e.g., axial, sagittal, coronal planes) transecting the patient.” ([0060]). GREGERSON also teaches displaying in real time, according to the position of the medical accessory, the two-dimensional image corresponding to the position of the medical accessory. “In block 307 of method 300, images of the patient’s anatomy from the first image dataset may be displayed with an overlay of one or more features derived from the second image dataset in the common coordinate system…The images of the patient’s anatomy may include 2D slices of a three-dimensional image dataset (e.g., a tomographic reconstruction)….” ([0048]). “The one or more features derived from the second image dataset that may be displayed overlaying the images of the patient’s anatomy may include graphical depictions of a tool 104, an end effector 102 or another object that is tracked by the motion tracking system 105…The graphical depiction may be a rendering of the actual size and shape of the object or may be a depiction of select features of the object, such as a location of a tip end of the object and/or an orientation of the object.” ([0049]). 
Regarding the images being in “real time,” Examiner is interpreting the subsequent display updates described in GREGERSON as teaching real-time display. “The motion tracking system 105 may repeatedly acquire new images from the optical sensing device 111, and the relative positions and/or orientations of objects within the field of view of the optical sensing device 111 may be updated with each acquisition of new images from the optical sensing device 111...The display device 121 may be updated to reflect any change(s) in the position and/or orientation of the objects within the common coordinate system (e.g., relative to the patient reference arc 115)….” ([0050]). As explained above with respect to claims 5 and 6, it would have been obvious to one having ordinary skill in the art at the time of filing to modify the JING LI system to display in real time, according to the position of the medical accessory, the two-dimensional image corresponding to the position of the medical accessory as taught in GREGERSON. One of ordinary skill in the art would have been motivated to show the different scan planes relative to the tool and the target, as taught in GREGERSON, because the surgeon desires relevant feedback (i.e., where the needle and tip of the needle are located relative to the target) during the procedure. There would have been a reasonable expectation of success as GREGERSON teaches that the tool can be shown in different 2D images of the anatomy. With respect to claim 19, GREGERSON teaches wherein, in the three-dimensional image of the medical image, the display unit performs superimposed display of: a planned movement path of the medical accessory, and/or information indicating a relative positional relationship between the medical accessory and a starting point and/or a target point of the movement path, and/or a predicted path determined according to the position and an extension direction of the medical accessory, and/or the camera image.
“The graphical representation of the tool 509 may illustrate the position and orientation of the tool relative to the anatomic features depicted in the 3D volume rendering 507.” ([0061]). “In embodiments, the display screen 500 may display graphical element(s) overlaying the image data corresponding to one or more target positions and/or trajectories that are set by the user.” ([0072]). Moreover, “[i]n some embodiments, the user may be able to make the superimposed image data (e.g., 3D volume rendering 734) more or less transparent relative to the camera images (e.g., real-time video images) shown on the display screen 500. A slider 735 or similar graphical interface element on the display screen 500 (e.g., a touchscreen display) may be used to adjust the relative transparency of the 3D volume rendering relative to the camera images, as shown in FIG. 7F.” ([0089]). As explained above with respect to claim 8, it would have been obvious to one having ordinary skill in the art at the time of filing to modify the JING LI system to superimpose, in the 3D volume, a planned movement path, a relative positional relationship between the accessory and a target point, or a predicted path, as taught in GREGERSON. One of ordinary skill in the art would have been motivated to superimpose such information, as taught in GREGERSON, because the surgeon would like to visualize a plan for the procedure and/or a position of the tool during the procedure. There would have been a reasonable expectation of success as GREGERSON teaches that information may be displayed with the 3D volume. Claim 15 is rejected under 35 U.S.C. 103 as being unpatentable over Li, Jing, et al. “A fully automatic surgical registration method for percutaneous abdominal puncture surgical navigation.” Computers in Biology and Medicine 136 (2021) (hereinafter “JING LI”) as applied to claim 14 above, and further in view of U.S. Patent Appl. Publ. No. 2017/0079722 A1 (hereinafter “O’GRADY”). 
With respect to claim 15, JING LI does not explicitly teach wherein when a scanned subject moves from a first position in which scanning is performed to a second position, further reconstructing the medical image according to a relative positional relationship between the first position and the second position. In the same field of endeavor, O’GRADY teaches methods and systems for registering a manipulator assembly and independently positionable surgical table. (Abstract). “In one aspect, methods include reading a fiducial marker on the surgical table with a sensor associated with the manipulator assembly and localizing the manipulator assembly and surgical table with respect to a common reference frame.” (Abstract). O’GRADY clearly concerns surgical navigation. “While viewing a two or three dimensional image of the surgical site on a display, the surgeon performs the surgical procedures on the patient by manipulating master control devices, which in turn control motion of the servo-mechanically operated instruments.” ([0004]; see also, e.g., [0010]). “[I]t would be desirable for such manipulator systems to have a means by which the surgical table can be ‘localized’ with the manipulator assembly such that a spatial relationship between the surgical table and the manipulator assembly can be determined and utilized in calculating movement of the surgical manipulators.” ([0010]). The surgical tools contemplated by O’GRADY include minimally-invasive devices that are inserted percutaneously into the patient, such as cannula and trocars. ([0055]). To this end, O’GRADY describes “[m]ethods of localization include reading one or more fiducial markers, such as 2D barcodes, disposed on a surgical table and converting a 3D pose of the surgical table to a 2D frame of reference common to the manipulator assembly and determining a spatial relationship between the surgical table and the manipulator assembly. 
Such methods may utilize a sensor, such as an optical sensor or camera, disposed within a base of the manipulator assembly such that when the surgical table is positioned in close proximity the surgical table can be localized relative the manipulator assembly with respect to the ground plane such that a spatial relationship between the pose of the surgical table and the manipulator assembly can be determined.” ([0011]). It would have been obvious to one having ordinary skill in the art at the time of filing to modify the JING LI system to account for when a patient moves from a first position in which scanning is performed to a second position where the surgery is performed and then further reconstruct the medical image according to a relative positional relationship between the first position and the second position. One having ordinary skill would be motivated to include this capability in order to permit movement of the patient during the surgical procedure (i.e., move from pre-operative/intraoperative imaging to a surgical position) as it increases the utility of the system. There would have been a reasonable expectation of success as O’GRADY teaches that movement of a patient table, such as a CT table, can be accounted for during surgery. Claims 12 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Li, Jing, et al. “A fully automatic surgical registration method for percutaneous abdominal puncture surgical navigation.” Computers in Biology and Medicine 136 (2021) (hereinafter “JING LI”) as applied to claims 1 and 11 above, and further in view of U.S. Patent Appl. Publ. No. 2018/0014890 A1 (hereinafter “STANTON”). With respect to claim 12 (depending from claim 11), JING LI does not teach that the marker is a cannula sleeved on a peripheral side of the puncture needle.
However, in the same field of endeavor, STANTON teaches a multi-stage dilator and cannula assembly for minimally-invasive surgical procedures (Abstract) including those that are used for motion tracking and surgical navigation. ([0024]). The cannula assembly can be used in different surgical procedures including “orthopedic, neurological, cardiothoracic, and general surgical procedures.” ([0026]). STANTON considers the cannula assembly capable of delivering a variety of surgical tools including “a tool for gripping or cutting, an electrode,…a radiation source….” ([0035]). As shown in Figure 1B, the cannula assembly 100 includes “a plurality of elongated members 101, 103, 105 in a nested configuration such that the members 101, 103 and 105 may slide relative to one another along a longitudinal axis, a.” ([0015]). The first member 101 may be a needle. ([0015]). The cannula assembly also includes “a marker device 119 which may be used for a motion tracking/surgical navigation system.” ([0024]). The marker device 119 includes a set of markers 121. ([0025]). “The markers 121 may be secured to the support structure 123 to provide a fixed, known geometric relationship of the markers 121 to each other and to the assembly 100, which may enable both the position (x, y, z) and the orientation (yaw, pitch, roll) of the assembly 100 to be fully resolved.” ([0025]). “For example, the marker device 119 may be used to track the motion of the multi-stage dilator and cannula assembly 100 as the first member 101 is advanced into the patient. Based on the tracked movement and known geometry of the assembly 100, the image guided surgery system may be used to determine when the tip of the first member 101 is located at a target position in the patient's body.” ([0030]).
It would have been obvious to one having ordinary skill in the art at the time of filing to modify or replace the ablation needle and use a nested cannula assembly, as taught in STANTON, having a marker that is a cannula sleeved on a peripheral side of the puncture needle. One of ordinary skill in the art would have been motivated to use the cannula assembly because the rigid outer cannula members can protect the thinner needles. There would have been a reasonable expectation of success as STANTON teaches that the cannula assembly can be used for various minimally-invasive procedures. With respect to claim 20 (depending from claim 16), JING LI discloses that the medical accessory comprises a puncture needle and a marker (See Figures 1-3 and Figure 14, each of which shows a needle attached to an array of reflective balls), wherein an extension direction of the marker forms a preset included angle with the extension direction of the puncture needle, and/or the marker is fixed at a preset position of the puncture needle. While JING LI does not explicitly state the above, one having ordinary skill in the art would recognize that the reflective balls (“marker”) form a pattern that necessarily forms a preset angle with the extension direction of the puncture needle and that is necessarily fixed at a preset position with respect to the puncture needle. Otherwise, the position of the needle could not be determined. As such, JING LI teaches the limitations of claim 11. However, JING LI does not teach that the marker is a cannula sleeved on a peripheral side of the puncture needle. As discussed above with respect to claim 12, it would have been obvious to one having ordinary skill in the art at the time of filing to modify or replace the ablation needle and use a nested cannula assembly, as taught in STANTON, having a marker that is a cannula sleeved on a peripheral side of the puncture needle. 
One of ordinary skill in the art would have been motivated to use the cannula assembly because the rigid outer cannula members can protect the thinner needles. There would have been a reasonable expectation of success as STANTON teaches that the cannula assembly can be used for various minimally-invasive procedures. Claims 3, 4, and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Li, Jing, et al. “A fully automatic surgical registration method for percutaneous abdominal puncture surgical navigation.” Computers in Biology and Medicine 136 (2021) (hereinafter “JING LI”) as applied to claim 1 above and further in view of U.S. Patent Appl. Publ. No. 2023/0355347 A1 (hereinafter “GREGERSON”) and U.S. Patent Appl. Publ. No. 2025/0235263 A1 (hereinafter “HAUTVAST”). With respect to claim 3, JING LI does not teach generating a grid matrix according to the scanning result, wherein a starting point and/or a target point of a planned movement path of the medical accessory are/is marked in the grid matrix, and superimposing the grid matrix in the camera image. In the same field of endeavor, GREGERSON teaches methods and systems for performing computer-assisted image-guided surgery. GREGERSON is particularly concerned with minimally-invasive surgeries (see, e.g., [0074]) and various tools can be guided, including a needle (see, e.g., [0049]). Like JING LI, the image data can be pre-operative CT images (e.g., [0032]) and the surgical tools are tracked by an optical sensing device using markers. “The motion tracking system 105 in the embodiment of FIG. 1 includes a plurality of marker devices 119, 202 and 315 and a stereoscopic optical sensor device 111 that includes two or more cameras (e.g., IR cameras).” ([0036]). The system has a display device that displays image data of the patient’s anatomy. ([0040]). 
“The display device 121 may facilitate planning for a surgical procedure, such as by enabling a surgeon to define one or more target positions in the patient’s body and/or a path or trajectory into the patient’s body for inserting surgical tool(s) to reach a target position while minimizing damage to other tissue or organs of the patient.” ([0040]). Figure 7F of GREGERSON illustrates “a handheld display device 401 having a rear-facing camera (schematically illustrated by 732) that is configured to [display] images of a patient 200 obtained by the camera 732 on a display screen 500.” ([0086]). “The patient marker(s) 731 may further enable registration of patient images (e.g., CT and/or MRI data) in a common coordinate system, as discussed above. In embodiments, the images from the camera 732 (e.g., real-time video images) may be overlaid with a three-dimensional volume rendering illustrating a “virtual” view of anatomic feature(s) (e.g., bony structures or other discrete internal anatomic features) as viewed from the current position and/or orientation of the handheld display device 401.” ([0087]). “[T]he user may be able to make the superimposed image data (e.g., 3D volume rendering 734) more or less transparent relative to the camera images (e.g., real-time video images) shown on the display screen 500.” ([0089]). In addition to image data, the system of GREGERSON can overlay or superimpose graphical elements representing “objects (e.g., tool(s), instrument(s)….)” with respect to the different coordinate systems (see, e.g., [0122], [0123], and [0132]).
“The patient images may be displayed with an overlay or superimposition of graphical element(s) showing the position and/or orientation of the one or more objects (e.g., tool(s), instrument(s), an end effector of a robotic arm) that are tracked by the motion tracking system 105, where the one or more objects may be shown within the blended or interpolated patient coordinate system.” ([0132]). In the same field of endeavor, HAUTVAST teaches a method for determining a virtual position of a virtual guidance device, such as a grid template, for guiding a needle, such as a virtual ablation applicator, into a patient. (Abstract). The medical image may be a pre-operative CT image. “Ablation applicators are used in ablation procedures like a percutaneous thermal ablation procedure. A percutaneous thermal ablation procedure uses one or several applicators, which are generally shaped as needles, for applying ablation energy to an ablation target like a tumor.” ([0002]). “Before carrying out the ablation procedure, the ablation procedure can be planned by using a pre-planning application. The planning is based on a pre-operative radiology image and can be used to answer questions like whether the patient is eligible for percutaneous thermal ablation and which type of ablation energy and how many applicators should be used.” ([0004]). Figure 3 of HAUTVAST shows a display that includes the virtual guidance device (i.e., the grid). After determining the virtual position, “[t]he determined virtual position can then be shown on a display 31. In this embodiment, also a representation of the prostate 11 and the tumor 15 and the virtual applicator 112 are shown on the display 31 as illustrated in FIG. 3. Thus, the planning apparatus 1 can be configured to generate a view of the determined virtual position of the virtual guidance device 119, the virtual applicator 112, the prostate 11 and the tumor 15 and show the generated view on the display 31.
The representation of the prostate 11 and the tumor 15 can be generated based on the provided image by using known techniques like segmentation.” ([0044]). Notably, “[i]n an embodiment, the real guidance device and the TRUS probe 40 are tracked by using known tracking technology like electromagnetic or optical tracking, wherein this tracking technology can be carried out by the ablation apparatus for determining the current position of the real guidance device.” ([0059]). It would have been obvious to one having ordinary skill in the art at the time of filing to modify the JING LI system to generate a grid matrix according to the scanning result and superimpose the grid matrix in the camera image based on the teachings of GREGERSON and HAUTVAST. Specifically, one having ordinary skill in the art would generate an ablation or biopsy plan using a virtual grid matrix, as taught in HAUTVAST, and superimpose the virtual grid matrix over the patient in the camera image, as taught in GREGERSON, while developing the plan. One would be motivated to modify the system in this manner to increase the utility of the system and enable the surgeon to better visualize the plan. There would have been a reasonable expectation of success as GREGERSON and HAUTVAST teach that the system can be used to generate surgical plans using superimposed tools and objects. Furthermore, it would have been obvious to one having ordinary skill in the art at the time of filing to further modify the JING LI system to mark a starting point and/or target point of the planned movement path in the grid matrix based on the teachings of GREGERSON and HAUTVAST. Because the entire exercise is to prepare a surgical plan that includes a planned movement path, necessary parts of the planned movement path include the point of insertion of the needle into the patient, which necessarily determines the starting point within the virtual grid matrix, and a target point (e.g., the tumor within the patient).
There would have been a reasonable expectation of success as GREGERSON and HAUTVAST teach that the system can be used to generate surgical plans using superimposed tools and objects. With respect to claim 4 (depending from claim 3), JING LI does not teach wherein the method further includes displaying on the grid matrix in real time the position of the medical accessory and/or information indicating a relative positional relationship between the medical accessory and the starting point and/or the target point. It would have been obvious to one having ordinary skill in the art at the time of filing to modify the JING LI system to display on the grid matrix in real time the position of the medical accessory and/or information indicating a relative positional relationship between the medical accessory and the starting point and/or the target point based on the teachings of GREGERSON and HAUTVAST. Specifically, one having ordinary skill in the art would be motivated to display a graphical representation of the medical accessory relative to the grid matrix during the surgical procedure. There would have been a reasonable expectation of success as GREGERSON and HAUTVAST teach that the system can be used during surgical procedures. With respect to claim 17, JING LI does not teach a generation unit which generates a grid matrix according to the scanning result, wherein a starting point and/or a target point of a planned movement path of the medical accessory is marked in the grid matrix and the display unit superimposing the grid matrix on the camera image. As discussed above with respect to claim 3, it would have been obvious to one having ordinary skill in the art at the time of filing to modify the JING LI system to generate a grid matrix according to the scanning result and superimpose the grid matrix in the camera image based on the teachings of GREGERSON and HAUTVAST. 
Specifically, one having ordinary skill in the art would generate an ablation or biopsy plan using a virtual grid matrix, as taught in HAUTVAST, and superimpose the virtual grid matrix over the patient in the camera image, as taught in GREGERSON, while developing the plan. One would be motivated to modify the system in this manner to increase the utility of the system and enable the surgeon to better visualize the plan. There would have been a reasonable expectation of success, as GREGERSON and HAUTVAST teach that the system can be used to generate surgical plans using superimposed tools and objects.

Furthermore, it would have been obvious to one having ordinary skill in the art at the time of filing to further modify the JING LI system to mark a starting point and/or target point of the planned movement path in the grid matrix based on the teachings of GREGERSON and HAUTVAST. Because the entire exercise is to prepare a surgical plan that includes a planned movement path, necessary parts of the planned movement path include the point of insertion of the needle into the patient, which necessarily determines the starting point within the virtual grid matrix, and a target point (e.g., the tumor within the patient). There would have been a reasonable expectation of success, as GREGERSON and HAUTVAST teach that the system can be used to generate surgical plans using superimposed tools and objects.

Prior Art Made of Record

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Each of US 2021/0236207 A1, US 2024/0315798 A1, and US 2020/0196906 A1 discloses registering different coordinate systems to an origin or isocenter of a CT gantry. US 2010/0016710 A1 discloses a virtual 3-D grid that is displayed to the user for generating a surgical plan.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JASON P GROSS, whose telephone number is (571) 272-1386.
The examiner can normally be reached Monday-Friday, 9:00-5:00 CT. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anne M. Kozak, can be reached at (571) 270-5284. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JASON P GROSS/
Examiner, Art Unit 3797

/SERKAN AKAR/
Primary Examiner, Art Unit 3797

Prosecution Timeline

May 31, 2024
Application Filed
Mar 28, 2026
Non-Final Rejection — §101, §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12582472
SYSTEMS FOR DETERMINING SIZE OF KIDNEY STONE
Granted Mar 24, 2026 · 2y 5m to grant
Patent 12514554
PRE-OPERATIVE ULTRASOUND SCANNING SYSTEM FOR PATIENT LIMB EXTENDING THROUGH A RESERVOIR
Granted Jan 06, 2026 · 2y 5m to grant
Patent 12502157
ULTRASOUND SYSTEM HAVING A DISPLAY DEVICE WITH DYNAMIC SCROLL MODE FOR B-MODE AND M-MODE IMAGES
Granted Dec 23, 2025 · 2y 5m to grant
Patent 12453602
ULTRASONIC PUNCTURE GUIDANCE PLANNING SYSTEM BASED ON MULTI-MODAL MEDICAL IMAGE REGISTRATION USING AN ITERATIVE CLOSEST POINT ALGORITHM
Granted Oct 28, 2025 · 2y 5m to grant


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 64%
With Interview: 99% (+62.5%)
Median Time to Grant: 2y 8m
PTA Risk: Low
Based on 14 resolved cases by this examiner. Grant probability derived from career allow rate.
