Prosecution Insights
Last updated: April 19, 2026
Application No. 18/191,110

METHOD FOR CORRECTING ARTIFACTS IN A COMPUTED TOMOGRAPHY IMAGE DATA SET, COMPUTED TOMOGRAPHY FACILITY, COMPUTER PROGRAM AND ELECTRONICALLY READABLE DATA CARRIER

Final Rejection — §102, §112
Filed
Mar 28, 2023
Examiner
THIRUGNANAM, GANDHI
Art Unit
2672
Tech Center
2600 — Communications
Assignee
Siemens Healthcare GmbH
OA Round
2 (Final)
74%
Grant Probability
Favorable
3-4
OA Rounds
3y 7m
To Grant
86%
With Interview

Examiner Intelligence

Grants 74% — above average
74%
Career Allow Rate
413 granted / 559 resolved
+11.9% vs TC avg
Moderate +12% lift
+12.3%
Interview Lift
Typical timeline
3y 7m
Avg Prosecution
42 currently pending
Career history
601
Total Applications
across all art units
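The headline numbers above are simple ratios of the career counts. A minimal sketch, assuming the page's 413 granted / 559 resolved figures and treating the +12.3% interview lift as an additive percentage-point adjustment (how the tool actually models the lift is not stated):

```python
# Sketch of the dashboard arithmetic, assuming the counts shown above
# (413 granted out of 559 resolved). Treating the interview lift as an
# additive percentage-point adjustment is an assumption for illustration.

def allow_rate_pct(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage."""
    return 100.0 * granted / resolved

base = allow_rate_pct(413, 559)   # ~73.9, displayed rounded to 74%
with_interview = base + 12.3      # ~86.2, displayed rounded to 86%
```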

Statute-Specific Performance

§101
9.6%
-30.4% vs TC avg
§103
35.8%
-4.2% vs TC avg
§102
21.5%
-18.5% vs TC avg
§112
27.1%
-12.9% vs TC avg
Black line = Tech Center average estimate • Based on career data from 559 resolved cases
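As a sanity check on the table above, subtracting each reported "vs TC avg" delta from its statute-specific rate recovers the implied Tech Center baseline. All values here are copied from the table; the arithmetic is the only thing demonstrated:

```python
# Back-computing the implied Tech Center baseline from each statute-specific
# allow rate and its reported delta. Values are the percentages shown above.

rates_and_deltas = {
    "§101": (9.6, -30.4),
    "§103": (35.8, -4.2),
    "§102": (21.5, -18.5),
    "§112": (27.1, -12.9),
}

implied_baseline = {s: rate - delta for s, (rate, delta) in rates_and_deltas.items()}
# Every statute implies the same ~40.0% baseline, which suggests the dashboard
# compares against a single TC-wide average rather than per-statute baselines.
```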

Office Action

§102 §112
DETAILED ACTION

Claim Objections

Claims 1-25 are objected to because of the following informalities: Claim 1 recites “and the ascertaining including at least one of ascertaining or adapting a first function that describes a progression of the at least one artifact with respect to a transverse direction perpendicular to the longitudinal direction of the at least substantially needle-shaped metal object in a longitudinal extension plane of the at least substantially needle-shaped metal object;”. This claim should recite “wherein the ascertaining comprises one of the following: Ascertaining[1] a first function; Adapting a first function; or Ascertaining a first function, and adapting the first function; wherein the first function describes a progression of the at least one artifact with respect to a transverse direction perpendicular to the longitudinal direction of the at least substantially needle-shaped metal object in a longitudinal extension plane of the at least substantially needle-shaped metal object;”. Claims 3, 7, and 8 are objected to for similar reasons as claim 1. Claim 15 is rejected under similar reasoning as claim 1. Claims 2-14 and 16-25 are rejected as dependent upon a rejected claim. Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-25 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claim 1 recites “a first function that describes a progression”. It is not clear what the scope of this term means. The most relevant dictionary definitions (https://www.merriam-webster.com/dictionary/progression) include: 1: a sequence of numbers in which each term is related to its predecessor by a uniform law; 2a: the action or process of progressing : advance; b: a continuous and connected series : sequence.

Paragraph 18 states: “[0018] Thus, a first, particularly preferred specific embodiment of the present invention provides that the artifact data set is ascertained at least partially by ascertaining and/or adapting at least one progression function, which describes the progression of the artifact that starts from the tip of the metal object in at least one direction in the image space, on the basis of image data, lying in the artifact region, of the reconstructed computed tomography image data set. In this context, in this embodiment, the prior knowledge can be included in the choice of the functional form of the at least one progression function or the parametrization thereof. In particular, it is conceivable when fitting image data or information derived therefrom to the at least one progression function to predefine boundary conditions, which considerably simplify the search for an optimum fit.”

Paragraph 20 mentions “HU value progression values”. Additionally, paragraph 55 discloses “it is possible to directly derive the progressions for the artifact 8, as the anticipated value for the attenuation value (HU value)”. This implies the progressions are the sequences of HU values of the artifact, which represent the boundary (contour) of the artifact.
Other than the implied definition above, it is not clear what is the full scope of the word “progression”. If Applicant agrees that the definition of “progression” is limited to only “a sequence of HU values of the artifact which represents the boundary/contour of the artifact”, this rejection will be withdrawn. Otherwise, the Examiner requests a more definite definition of this term. Claim 15 is rejected under similar reasoning as claim 1. Claims 2-14 and 16-25 are rejected as dependent upon a rejected claim.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-3, 10, 11-15, 19-20, and 25 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Xu (PGPub 2021/0056688).

Xu discloses 1. A method for correcting artifacts in a computed tomography image data set of a recording region in which an at least substantially needle-shaped metal object is located, the computed tomography image data set being reconstructed from projection images recorded at least partially such that the at least substantially needle-shaped metal object is irradiated at least substantially in a longitudinal direction, and the method comprising: ascertaining an artifact data set describing at least one artifact in an image space, the at least one artifact being caused by the at least substantially needle-shaped metal object, the ascertaining being based on prior knowledge about the at least one artifact; and (Xu, Fig. 1, #34, “Metal artifact image”) the ascertaining including at least one of ascertaining or adapting a first function that describes a progression of the at least one artifact with respect to a transverse direction perpendicular to the longitudinal direction of the at least substantially needle-shaped metal object in a longitudinal extension plane of the at least substantially needle-shaped metal object; (Xu, paragraphs 33-34, “[0033] With reference now to FIG. 2, to generate training sets, mono- and poly-chromatic projections (or, equivalently, mono- and poly-energetic projections) of digital phantoms containing metal objects were simulated. As shown in FIG. 2, CNN training sets were generated from a digital phantom that contained either a surgical screw 50 within the transaxial plane (a: left-hand image of FIG. 2) or two metal rod implants 52, 54 along the craniocaudal direction (b: right-hand image of FIG. 2). The grayscale window was [−400, 400] HU. …. Two scenarios were considered: (i) the presence of the surgical screw 50 within the transaxial plane (left-hand image of FIG. 2); and (ii) the presence of two metal rod implants 52, 54 along the craniocaudal direction (right-hand image of FIG. 2). The digital phantom also contains a water ellipse 56 (major axis ˜150 mm, minor axis ˜120 mm) to simulate body attenuation. A circular insert (diameter ˜50 mm, attenuation 100 HU higher than water) was also added to examine the performance of the proposed method in the presence of relatively low contrast object. The metal material was assumed to be Titanium in the simulations. The monochromatic projections were simulated assuming an effective energy of 71 kV of the incident x-ray spectrum. The poly-chromatic projections were simulated according to:

I = ∫ I₀(E) exp(−μ(E)·l) dE

where I₀(E) denotes the incident x-ray spectrum as a function of photon energy E, I is total transmitted intensity, and l is path length computed using a custom Graphical Processor Unit (GPU)-based forward projector. The simulated mono- and poly-chromatic projections were then reconstructed using three-dimensional (3D) filtered-backprojection (FBP) to form “Mono” (regarded as ground truth) and “Poly” images (containing metal artifacts) respectively.”; where the neural network is trained using images ascertained from eqn. 2, these images representing “surgical screw 50” or “metal rod implants 52, 54 along a craniocaudal direction”, where the HU value range of [−400, 400] is used, implying the images are also in HU, where prior knowledge includes but is not limited to the type of material, in this case titanium and its associated properties, and the ascertained artifact data set is the metal artifact image) subtracting the artifact data set from the computed tomography image data set (Xu, Fig. 1, #36 & #40, subtracting the metal artifact image from the uncorrected image to get the corrected image). *NOTE: Additionally, there is a loss function which is constantly updated during the training process (see paragraph 31), which thus also anticipates the adapting limitation.

Xu discloses 2. The method as claimed in claim 1, wherein during intervention monitoring with the at least substantially needle-shaped metal object as an intervention instrument at least one of the method is performed for each recorded computed tomography image data set of a monitoring series, an image plane of the computed tomography image data set is a longitudinal extension plane of the at least substantially needle-shaped metal object, or the at least substantially needle-shaped metal object is an intervention needle.
(Xu, paragraph 40, “As noted above, in experiments the CNN correction speed was about 80 images per second, which is practical for use in correcting “live” images generated by a C-arm 10 (e.g. FIG. 1) during an iGT procedure. Furthermore, as seen in FIGS. 3 and 4, the metal artifact image (second column from left in FIGS. 3 and 4) can provide effectively segmented representation of the metal artifact. Although this image exhibits blooming or other distortion compared with the actual boundaries of the metal object causing the artifact, it is seen that the metal artifact image provides an isolation image of the metal object that can, for example, be fitted to a known metal object geometry to provide for accurate live tracking of a biopsy needle, metal prosthesis, or other known metal object that is to be manipulated during the iGT procedure. In one approach, the corrected X-ray image 40 is displayed on the display 46 with the metal artifact image 34 (or an image derived from the metal artifact image 34, such as an image of the underlying metal object positioned to be spatially registered with the metal artifact image 34) is also displayed on the display 46, e.g. superimposed onto or otherwise fused with the display of the corrected X-ray image 40. As another application, the density of the image of the metal object captured in the metal artifact image 34 (or other information such as the extent of blooming) may be used to classify the metal object as to metal type, or the metal object depicted by the metal artifact image 34 may be identified based on shape, and/or so forth. In some embodiments, an identification approach such as one disclosed in Walker et al., U.S. Pub. No. 2012/0046971 A1 (published Feb. 23, 2012) may be used. In some embodiments, to maximize processing speed for live imaging during iGT or other time-critical imaging tasks, the image reconstruction method 26 does not include any metal artifact correction other than by applying the neural network 32 to the uncorrected X-ray image 30 to generate the metal artifact image 34 and generating the corrected X-ray image 40 by subtracting the metal artifact image from the uncorrected x ray image.”)

Xu discloses 3. The method as claimed in claim 1, wherein the ascertaining the artifact data set includes at least one of ascertaining or adapting at least one second function describing a progression of the at least one artifact that starts from a tip of the at least substantially needle-shaped metal object in at least one direction in the image space, the at least one of ascertaining or adapting based on image data, lying in an artifact region, of the computed tomography image data set, the at least one second function including the first function, and the at least one of ascertaining or adapting the at least one second function being based on first image data in an artifact region of the computed tomography image data set. (see claim 1, where there are a plurality of training images and thus a plurality of functions)

Claim 4 is rejected under similar reasoning as claims 1 and 3. Claims 6-7 are rejected under similar reasoning as claims 3-4. Claims 8-9 are rejected under similar reasoning as claims 3-4.

Xu discloses 10. The method as claimed in claim 1, wherein the ascertaining of the artifact data set comprises: at least one of choosing or determining the artifact data set, at least partially from reference data sets present in a database, for at least one of various reference metal objects or various parameters of at least one of the at least substantially needle-shaped metal object or the various reference metal objects. (Xu, paragraph 40, as quoted above)

Xu discloses 11. The method as claimed in claim 10, wherein the reference data sets are based on learning measurements of at least one of the at least substantially needle-shaped metal object or at least one reference metal object of a same type in a phantom. (Xu, paragraph 40, as quoted above)

Xu discloses 19. The method as claimed in claim 11, wherein the phantom is a structureless phantom. (addressing the alternative) Xu discloses 20. The method as claimed in claim 19, wherein the structureless phantom is a water phantom. (addressing the alternative)

Xu discloses 25. The method as claimed in claim 11, wherein at least part of the ascertaining of the artifact data set is performed via the database based on recording parameters that deviate with regard to the reference data sets. (Xu, paragraph 40, as quoted above)

Xu discloses 12. The method as claimed in claim 10, wherein at least part of the ascertaining of the artifact data set is performed via the database based on recording parameters that deviate with regard to the reference data sets. (Xu, paragraph 40, as quoted above)

Xu discloses 13. A computed tomography device including a control device configured to perform the method as claimed in claim 1. (see claim 1 above) Xu discloses 14. A non-transitory computer-readable medium storing computer-executable instructions that, when executed at a control device of a computed tomography device, cause the computed tomography device to perform the method of claim 1. (see claim 1 above)

Xu discloses 15. A computed tomography device comprising: a memory storing computer-executable instructions; and at least one processor configured to execute the computer-executable instructions to cause the computed tomography device (Xu, Abstract) to ascertain an artifact data set describing, in an image space, at least one artifact caused by an at least substantially needle-shaped metal object, wherein the artifact data set is ascertained based on prior knowledge about the at least one artifact (Xu, Fig. 1, #34, “Metal artifact image”; paragraph 40, as quoted above); subtract the artifact data set from a computed tomography image data set of a recording region in which the at least substantially needle-shaped metal object is located, to correct for artifacts in the computed tomography image data set (Xu, Fig. 1, #36 & #40, subtracting the metal artifact image from the uncorrected image to get the corrected image); wherein the computed tomography image data set is reconstructed from projection images recorded at least partially such that the at least substantially needle-shaped metal object is irradiated at least substantially in a longitudinal direction. (Xu, Fig. 1, #16)

Claims 17-20 are rejected based on Xu (paragraph 33, Fig. 2). Claims 21-25 are rejected under similar grounds as claims 3-4. No prior art reads on claims 5, 12, 25; these claims are currently rejected under 35 USC 112, second paragraph.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to GANDHI THIRUGNANAM whose telephone number is (571)270-3261. The examiner can normally be reached M-F 8:30-5PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Sumati Lefkowitz can be reached at 571-272-3638. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. 
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/GANDHI THIRUGNANAM/
Primary Examiner, Art Unit 2672

[1] The BRI of ascertaining includes: determining, finding, learning, discovering.
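The correction scheme at issue in this office action (ascertain an artifact data set from a fitted "progression" function of HU values starting at the needle tip, then subtract it from the CT image data set) can be sketched as a toy 1-D example. The exponential functional form, the helper names, and every numeric value below are illustrative assumptions, not taken from the application or from Xu:

```python
import numpy as np

# Toy 1-D illustration of the claimed correction: model the artifact as a
# "progression" of HU offsets starting at the needle tip, build an artifact
# data set from that function, and subtract it from the image (all in HU).
# The exponential decay model and all numbers here are assumptions.

def progression(z: np.ndarray, amplitude: float, tau: float) -> np.ndarray:
    """First function: HU offset vs. distance z (voxels) from the needle tip."""
    return amplitude * np.exp(-z / tau)

def ascertain_artifact(n_voxels: int, tip: int, amplitude: float, tau: float) -> np.ndarray:
    """Ascertain the artifact data set: zero before the tip, decaying after it."""
    z = np.arange(n_voxels, dtype=float) - tip
    return np.where(z >= 0.0, progression(np.maximum(z, 0.0), amplitude, tau), 0.0)

def correct(image_hu: np.ndarray, tip: int, amplitude: float, tau: float) -> np.ndarray:
    """Subtract the ascertained artifact data set from the CT image data set."""
    return image_hu - ascertain_artifact(image_hu.size, tip, amplitude, tau)

# A flat 50-HU background corrupted by a 300-HU streak decaying over ~5 voxels:
clean = np.full(32, 50.0)
corrupted = clean + ascertain_artifact(32, tip=10, amplitude=300.0, tau=5.0)
restored = correct(corrupted, tip=10, amplitude=300.0, tau=5.0)
# restored equals the clean background when the fitted progression is exact
```

In practice the amplitude and decay parameters would be fitted to image data in the artifact region (with prior knowledge constraining the functional form), which is the step the examiner's §112 analysis of "progression" turns on.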

Prosecution Timeline

Mar 28, 2023
Application Filed
Aug 09, 2025
Non-Final Rejection — §102, §112
Oct 21, 2025
Applicant Interview (Telephonic)
Oct 22, 2025
Examiner Interview Summary
Nov 12, 2025
Response Filed
Feb 16, 2026
Final Rejection — §102, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597135
SYSTEMS AND METHODS FOR UPDATING A GRAPHICAL USER INTERFACE BASED UPON INTRAOPERATIVE IMAGING
2y 5m to grant Granted Apr 07, 2026
Patent 12561963
CROSS-MODALITY NEURAL NETWORK TRANSFORM FOR SEMI-AUTOMATIC MEDICAL IMAGE ANNOTATION
2y 5m to grant Granted Feb 24, 2026
Patent 12555291
METHOD FOR AUTOMATED REGULARIZATION OF HYBRID K-SPACE COMBINATION USING A NOISE ADJUSTMENT SCAN
2y 5m to grant Granted Feb 17, 2026
Patent 12541869
GRAIN FLAKE MEASUREMENT SYSTEM, GRAIN FLAKE MEASUREMENT METHOD, AND GRAIN FLAKE COLLECTION, MOVEMENT, AND MEASUREMENT SYSTEM
2y 5m to grant Granted Feb 03, 2026
Patent 12525007
TRAINING METHOD AND ELECTRONIC DEVICE
2y 5m to grant Granted Jan 13, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
74%
Grant Probability
86%
With Interview (+12.3%)
3y 7m
Median Time to Grant
Moderate
PTA Risk
Based on 559 resolved cases by this examiner. Grant probability derived from career allow rate.
