Prosecution Insights
Last updated: April 19, 2026
Application No. 18/787,842

3D Alignment in a 3D Coordinate System of an AR Headset Using 2D Reference Images

Status: Non-Final OA (§103)
Filed: Jul 29, 2024
Examiner: THOMPSON, JAMES A
Art Unit: 2615
Tech Center: 2600 (Communications)
Assignee: Novarad Corporation
OA Round: 1 (Non-Final)

Grant Probability: 85% (Favorable)
Expected OA Rounds: 1-2
Median Time to Grant: 2y 11m
Grant Probability With Interview: 89%

Examiner Intelligence

Career Allow Rate: 85% (612 granted / 717 resolved; +23.4% vs TC avg; above average)
Interview Lift: +3.7% (minimal lift, roughly +4%, across resolved cases with interview)
Typical Timeline: 2y 11m average prosecution; 11 applications currently pending
Career History: 728 total applications across all art units

Statute-Specific Performance

§101: 8.8% (-31.2% vs TC avg)
§103: 54.4% (+14.4% vs TC avg)
§102: 25.0% (-15.0% vs TC avg)
§112: 7.9% (-32.1% vs TC avg)

Tech Center averages are estimates. Based on career data from 717 resolved cases.
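The "vs TC avg" deltas above can be cross-checked arithmetically: subtracting each delta from the examiner's rate backs out the implied Tech Center baseline. A quick consistency check using the figures in this panel (the dashboard does not document its methodology, so this is illustrative, not its actual computation):

```python
# Figures copied from the panel above; the implied baseline is the
# examiner's rate minus the displayed delta vs the Tech Center average.
examiner_rate = {"101": 8.8, "103": 54.4, "102": 25.0, "112": 7.9}
delta_vs_tc = {"101": -31.2, "103": 14.4, "102": -15.0, "112": -32.1}

implied_tc_avg = {s: round(examiner_rate[s] - delta_vs_tc[s], 1)
                  for s in examiner_rate}
print(implied_tc_avg)  # every statute backs out to the same 40.0% baseline
```

That all four statutes back out to the same 40.0% suggests the panel compares against a single Tech Center estimate rather than per-statute baselines.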

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

2. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Claim Rejections - 35 USC § 103

3. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

4. Claims 1-23 are rejected under 35 U.S.C. 103 as being unpatentable over Gibby (US-2020/0159313) in view of Steinberg (US-2020/0405399).
Regarding claim 1: Gibby discloses a method, comprising: receiving 2D (two dimensional) image data from a first medical imaging device (fig 1, fig 6, [0023]-[0025], and [0051] of Gibby – 2D image data captured for use by medical professional with AR headset), wherein the 2D image data has a location and graphical mark for a point representing a feature (fig 2(200), [0026], and [0030]-[0032] of Gibby), for use by an augmented reality (AR) headset ([0035]-[0036] of Gibby); computing a 3D coordinate of the point in the 3D coordinate system using a location and orientation of the first medical imaging device with respect to the AR headset (fig 7 and [0059]-[0060] of Gibby – position and orientation in 3D space of the medical imaging device 660 with respect to the body of the patient is used to modify how the medical image is displayed in the medical professional’s AR device, aligning in the appropriate orientation with respect to the body of the person 606a as viewed through AR headset); receiving a 3D image data set obtained from a second medical imaging device and a marker identifying the feature in the 3D image data set (fig 2(200), fig 8A, [0032]-[0033], and [0068]-[0070] of Gibby – a 3D image data set is received based on the marker 200 identifying various features for the 3D medical data imaging and based on imaging from an x-ray imaging device (second imaging device) aligned according to fluoroscopic imaging (first imaging device)); and aligning the 3D image data set to a body of a person by aligning the marker in the 3D image data set to the point in the 2D image data set ([0069]-[0070] of Gibby – aligned according to fluoroscopic imaging (first imaging device) and information provided by the visible marker/tag provided on the patient’s body).

Gibby does not disclose the marker is a virtual marker.
Steinberg discloses a virtual marker identifying the feature (figs 12A-12P, and [0345]-[0366] of Steinberg – various types of virtual markers used to identify a variety of features in the medical images). Gibby and Steinberg are analogous art because they are from the same field of endeavor, namely medical imaging. Before the effective filing date of the invention, it would have been obvious to one of ordinary skill in the art to use specifically a virtual marker, as taught by Steinberg. The suggestion for doing so would have been that the imaging system is a virtual system (augmented reality), and thus using virtual markers rather than placing physical markers on the patient would be more efficient. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Gibby according to the relied-upon teachings of Steinberg to obtain the invention as specified in claim 1.

Regarding claim 2: Gibby in view of Steinberg discloses the method as in claim 1 (as rejected above), where the feature is for patient anatomy including at least one of: a physical structure, physical patient anatomy or a distinctive feature (fig 4 and [0046]-[0048] of Gibby).

Regarding claim 3: Gibby in view of Steinberg discloses the method as in claim 1 (as rejected above), wherein the point in the 2D image data and the virtual marker identified in the 3D image data set represent an anatomical landmark having a 3D coordinate (figs 3-4(310) and [0042] of Gibby – marker is virtual marker as per the combination with Steinberg). Gibby and Steinberg are combinable for the reasons set forth above with respect to claim 1.
Regarding claim 4: Gibby in view of Steinberg discloses the method as in claim 1 (as rejected above), further comprising aligning the 3D image data set with the 2D image data as the 2D image data moves within the 3D coordinate system due to movement of a patient over time ([0037]-[0039], and [0049] of Gibby; [0402] of Steinberg – can be both with respect to movement of the medical professional (Gibby) or with respect to movement of the patient (Steinberg)). Gibby and Steinberg are combinable for the reasons set forth above with respect to claim 1.

Regarding claim 5: Gibby in view of Steinberg discloses the method as in claim 1 (as rejected above), wherein the 2D image data is an ultrasound image (fig 8B and [0071]-[0073] of Gibby).

Regarding claim 6: Gibby in view of Steinberg discloses the method as in claim 1 (as rejected above), wherein the 3D image data set is a CT (computed tomography) scan or an MRI (magnetic resonance imaging) scan ([0025]-[0026], and [0032] of Gibby).

Regarding claim 7: Gibby discloses a method, comprising: receiving 2D (two dimensional) image data of patient anatomy from a first medical imaging device (fig 1, fig 6, [0023]-[0025], and [0051] of Gibby – 2D image data captured for use by medical professional with AR headset); identifying a location of the 2D image data in a 3D (three dimensional) coordinate system of an augmented reality (AR) headset (fig 2 (200,206) and [0030]-[0033] of Gibby – identify location of 2D image data corresponding to 2D markers 200, 206); identifying one or more points of a structure, point of anatomy or distinctive feature within the 2D image data (fig 4(400,402) and [0047] of Gibby – for example, incision points); computing a 3D coordinate of the point in the 3D coordinate system using a location and orientation of the first medical imaging device with respect to the AR headset (fig 7 and [0059]-[0060] of Gibby – position and orientation in 3D space of the medical imaging device 660 with respect to the body of
the patient is used to modify how the medical image is displayed in the medical professional’s AR device, aligning in the appropriate orientation with respect to the body of the person 606a as viewed through AR headset); receiving a 3D image data set obtained from a second medical imaging device and a marker identifying an anatomical location in the 3D image data set (fig 2(200), fig 8A, [0032]-[0033], and [0068]-[0070] of Gibby – a 3D image data set is received based on the marker 200 identifying various features for the 3D medical data imaging and based on imaging from an x-ray imaging device (second imaging device) aligned according to fluoroscopic imaging (first imaging device)); and aligning the 3D image data set to a body of a person by aligning the marker in the 3D image data set to the point in the 2D image data set ([0069]-[0070] of Gibby – aligned according to fluoroscopic imaging (first imaging device) and information provided by the visible marker/tag provided on the patient’s body).

Gibby does not disclose the marker is a virtual marker.

Steinberg discloses a virtual marker identifying an anatomical location (figs 12A-12P, and [0345]-[0366] of Steinberg – various types of virtual markers used to identify a variety of features at various locations in the medical images). Gibby and Steinberg are analogous art because they are from the same field of endeavor, namely medical imaging. Before the effective filing date of the invention, it would have been obvious to one of ordinary skill in the art to use specifically a virtual marker, as taught by Steinberg. The suggestion for doing so would have been that the imaging system is a virtual system (augmented reality), and thus using virtual markers rather than placing physical markers on the patient would be more efficient.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Gibby according to the relied-upon teachings of Steinberg to obtain the invention as specified in claim 7.

Regarding claim 8: Gibby in view of Steinberg discloses the method as in claim 7 (as rejected above), wherein the virtual marker identified in the 3D image data set is an anatomical landmark having a 3D coordinate (figs 3-4(310) and [0042] of Gibby – marker is virtual marker as per the combination with Steinberg). Gibby and Steinberg are combinable for the reasons set forth above with respect to claim 7.

Regarding claim 9: Gibby in view of Steinberg discloses the method as in claim 7 (as rejected above), further comprising aligning the 3D image data set with the 2D image data as the 2D image data moves within the 3D coordinate system due to movement of a patient over time ([0037]-[0039], and [0049] of Gibby; [0402] of Steinberg – can be both with respect to movement of the medical professional (Gibby) or with respect to movement of the patient (Steinberg)). Gibby and Steinberg are combinable for the reasons set forth above with respect to claim 7.

Regarding claim 10: Gibby in view of Steinberg discloses the method as in claim 7 (as rejected above), further comprising displaying 2D image data while viewing patient anatomy through the AR headset to generate a 3D spatial mapping and target an anatomical structure in the person (fig 1 and [0027] of Gibby).

Regarding claim 11: Gibby in view of Steinberg discloses the method as in claim 7 (as rejected above), further comprising displaying a procedure guidance indicator displaying a virtual trajectory or virtual path for a medical tool or augmentation tag identified by the AR headset to the 3D coordinate of the point (fig 4, [0027], and [0047] of Gibby).
Regarding claim 12: Gibby in view of Steinberg discloses the method as in claim 7 (as rejected above), wherein the 2D image data is an ultrasound image (fig 8B and [0071]-[0073] of Gibby).

Regarding claim 13: Gibby in view of Steinberg discloses the method as in claim 7 (as rejected above), wherein the first medical imaging device has an optical marker on a surface of the first medical imaging device to identify a location and orientation of the first medical imaging device and provide 3D coordinates of the 2D image data (fig 7(652,658), [0052], [0055], and [0059] of Gibby).

Regarding claim 14: Gibby in view of Steinberg discloses the method as in claim 7 (as rejected above), wherein a position and location of a medical imaging device is detectable using at least one of physical markers, infrared markers, or magnetic trackers (fig 7(652,658), [0052], [0055], and [0059] of Gibby).

Regarding claim 15: Gibby in view of Steinberg discloses the method as in claim 7 (as rejected above), wherein the 3D image data set is a CT (computed tomography) scan or an MRI (magnetic resonance imaging) scan ([0025]-[0026], and [0032] of Gibby).

Regarding claim 16: Gibby in view of Steinberg discloses the method as in claim 7 (as rejected above), wherein an orientation and size of the 2D image data is determined ([0061]-[0064] of Gibby – determines orientation of the 2D image data ([0061]-[0062]), along with the size, which is used for zoom adjustments ([0063]-[0064])).
Regarding claim 17: Gibby discloses a method, comprising: receiving a plurality of 2D (two dimensional) images of patient anatomy representing non-parallel projections from a medical imaging device (fig 1, figs 6-7, fig 8A, [0023]-[0025], [0051], [0059], and [0068] of Gibby – multiple 2D image data sets, fluoroscopic and x-ray, captured for use by medical professional with AR headset at different orientations); identifying a position and orientation of an imaging device in a 3D (three dimensional) coordinate system with relation to a person and an augmented reality (AR) headset when capturing the plurality of 2D images (fig 7 and [0059]-[0060] of Gibby – position and orientation in 3D space of the medical imaging device 660 with respect to the body of the patient is used to modify how the medical image is displayed in the medical professional’s AR device, aligning in the appropriate orientation with respect to the body of the person 606a as viewed through AR headset); receiving at least two graphical marks from a user for a common anatomical structure in the plurality of 2D images (fig 2(200,206) and [0032]-[0037] of Gibby – multiple markers 200, 206 captured in 2D images at different orientations and locations); determining a 3D coordinate of the common anatomical structure in the 3D coordinate system with respect to a patient by taking an intersection of projected lines from an X-ray source to the at least two graphical marks in the plurality of 2D images (figs 6-7, fig 8A, [0060]-[0061], and [0067]-[0068] of Gibby – 3D coordinates of anatomical structures determined according to graphical markers and indicators, and the alignments of the image data according to the image projection); identifying a 3D image data set obtained from a 3D medical imaging device that contains a marker identifying an anatomical location in the 3D image data set (fig 2 (200), fig 8A, [0032]-[0033], and [0068]-[0070] of Gibby – a 3D image data set is received based on the marker 200
identifying various features/anatomical locations for the 3D medical data imaging and based on imaging from an x-ray imaging device aligned according to fluoroscopic imaging); and aligning the 3D image data set to a body of a person by using the marker in the 3D image data set and the intersection for the plurality of 2D images, using the AR headset ([0069]-[0070] of Gibby – aligned according to fluoroscopic imaging (first imaging device) and information provided by the visible marker/tag provided on the patient’s body, as viewed through the AR display).

Gibby does not disclose the marker is a virtual marker.

Steinberg discloses a virtual marker identifying an anatomical location (figs 12A-12P, and [0345]-[0366] of Steinberg – various types of virtual markers used to identify a variety of features at various locations in the medical images). Gibby and Steinberg are analogous art because they are from the same field of endeavor, namely medical imaging. Before the effective filing date of the invention, it would have been obvious to one of ordinary skill in the art to use specifically a virtual marker, as taught by Steinberg. The suggestion for doing so would have been that the imaging system is a virtual system (augmented reality), and thus using virtual markers rather than placing physical markers on the patient would be more efficient. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify Gibby according to the relied-upon teachings of Steinberg to obtain the invention as specified in claim 17.
Regarding claim 18: Gibby in view of Steinberg discloses the method as in claim 17 (as rejected above), further comprising aligning the 3D image data set with 2D image data as the 2D image data moves within the 3D coordinate system due to movement of the person over time ([0037]-[0039], and [0049] of Gibby; [0402] of Steinberg – can be both with respect to movement of the medical professional (Gibby) or with respect to movement of the patient (Steinberg)). Gibby and Steinberg are combinable for the reasons set forth above with respect to claim 17.

Regarding claim 19: Gibby in view of Steinberg discloses the method as in claim 17 (as rejected above), further comprising displaying 2D images from the medical imaging device while viewing patient anatomy using the AR headset, in order to generate a 3D spatial mapping (fig 1 and [0027] of Gibby).

Regarding claim 20: Gibby in view of Steinberg discloses the method as in claim 17 (as rejected above), wherein a graphical mark is at least one of a point, line, navigational track, 2D area, or 3D area target (fig 2(200,206) and [0033] of Gibby).

Regarding claim 21: Gibby in view of Steinberg discloses the method as in claim 17 (as rejected above), further comprising providing a navigation path from a medical tool identified by the AR headset to the 3D coordinate of the common anatomical structure (fig 4 and [0047] of Gibby).

Regarding claim 22: Gibby in view of Steinberg discloses the method as in claim 17 (as rejected above), wherein the 2D images are aligned by using a virtual line placed on common anatomy in each of the 2D images (fig 4 and [0047] of Gibby).

Regarding claim 23: Gibby in view of Steinberg discloses the method as in claim 17 (as rejected above), wherein the 2D images are at least one of: X-ray generated images, ultrasound images, CT generated images, or MRI generated images ([0025]-[0026], and [0032] of Gibby).

Allowable Subject Matter

5. Claims 24-30 are allowed.
The following is an examiner’s statement of reasons for allowance:

Claim 24 recites a “method for registering X-ray generated images in 3D coordinate space while viewing a patient using an augmented reality (AR) headset, comprising: receiving a plurality of X-ray generated images of patient anatomy; aligning the plurality of X-ray generated images using an optically visible marker and associated image visible marker on a patient and the associated image visible marker is represented in the X-ray generated image; receiving a first point mark-up for annotating an anatomic structure on a first X-ray generated image and a second point mark-up in a second X-ray generated image, wherein the first point mark-up and second point mark-up represent common anatomical structure in the patient or a medical procedure trajectory; aligning the first point mark-up and the second point mark-up to generate an intersection using the first X-ray generated image and second X-ray generated image viewable using the AR headset; and identifying a 3D image data set obtained from a 3D medical imaging device that contains a virtual marker identifying the common anatomical structure in the 3D image data set; and aligning the 3D image data set to the common anatomical structure by aligning the virtual marker in the 3D image data set to the intersection for the plurality of X-ray generated images, using the AR headset.”

Some features of claim 24 are taught by the prior art.
Gibby discloses a method for registering X-ray generated images in 3D coordinate space while viewing a patient using an augmented reality (AR) headset (figs 8A-9B, [0064], [0068]-[0073], and [0079]-[0083] of Gibby), comprising: receiving a plurality of X-ray generated images of patient anatomy (figs 6-7, fig 8A, and [0023]-[0025], [0051], [0059], and [0068] of Gibby – multiple 2D image data sets, fluoroscopic and x-ray, captured for use by medical professional with AR headset at different orientations); aligning the plurality of X-ray generated images using an optically visible marker and associated image visible marker on a patient and the associated image visible marker is represented in the X-ray generated image (fig 8A (808,830), [0063], and [0065]-[0069] of Gibby – multiple x-ray and multiple fluoroscopic images aligned according to optical codes 808 on the body of the patient and radiopaque markers 830); receiving a first point mark-up for annotating an anatomic structure on a first X-ray generated image and a second point mark-up in a second X-ray generated image, wherein the first point mark-up and second point mark-up represent common anatomical structure in the patient or a medical procedure trajectory (fig 4(400,402,404) and [0047] of Gibby – first point mark-up 400,402 and second point mark-up 404 represent medical procedure trajectory for procedure performed on a particular anatomical structure).

Steinberg discloses a virtual marker identifying an anatomical structure (figs 12A-12P, and [0345]-[0366] of Steinberg – various types of virtual markers used to identify a variety of features at various locations in the medical images).
The combination of Gibby and Steinberg does not disclose aligning the first point mark-up and the second point mark-up to generate an intersection using the first X-ray generated image and second X-ray generated image viewable using the AR headset; and identifying a 3D image data set obtained from a 3D medical imaging device that contains a marker identifying the common anatomical structure in the 3D image data set; and aligning the 3D image data set to the common anatomical structure by aligning the marker in the 3D image data set to the intersection for the plurality of X-ray generated images, using the AR headset.

Furthermore, Examiner has not discovered any additional prior art which fully teaches claim 24, either singly or in an obvious combination. Additional relevant prior art includes: Tolkowsky (US-2024/0307122), Denissen (US-2023/0404678), Braido (US-2023/0157757), and Nikou (US-2023/0019543). However, none of the prior art cited above fully teaches claim 24, either singly or in an obvious combination. Therefore, claim 24 distinguishes over the prior art. Claims 25-30 each distinguish over the prior art at least due to their respective dependencies. Claims 24-30 each distinguish over the prior art, and there are no outstanding grounds of rejection or objection with respect to claims 24-30. Accordingly, claims 24-30 are allowed.

Any comments considered necessary by applicant must be submitted no later than the payment of the issue fee and, to avoid processing delays, should preferably accompany the issue fee. Such submissions should be clearly labeled “Comments on Statement of Reasons for Allowance.”

Contact Information

Any inquiry concerning this communication or earlier communications from the examiner should be directed to James A Thompson whose telephone number is (571)272-7441. The examiner can normally be reached M-F 8am-6pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alicia Harrington, can be reached at 571-272-2330. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JAMES A THOMPSON/
Primary Examiner, Art Unit 2615
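Claims 17 and 24 both turn on determining a 3D coordinate by "taking an intersection of projected lines from an X-ray source" to two graphical marks. In practice two back-projected rays rarely intersect exactly, so implementations of this kind of step typically take the midpoint of the shortest segment between the rays. A minimal sketch of that computation (the function name and coordinate conventions are illustrative assumptions, not taken from the application or the cited references):

```python
def triangulate(source_a, mark_a, source_b, mark_b):
    """Midpoint of the shortest segment between two rays (x, y, z tuples).

    Each ray runs from an X-ray source position through the 3D position of a
    user-placed graphical mark on that projection; because the rays rarely
    intersect exactly, their closest-approach midpoint serves as the
    "intersection" point.
    """
    sub = lambda p, q: tuple(pi - qi for pi, qi in zip(p, q))
    dot = lambda p, q: sum(pi * qi for pi, qi in zip(p, q))

    u = sub(mark_a, source_a)          # direction of ray A
    v = sub(mark_b, source_b)          # direction of ray B
    w = sub(source_a, source_b)
    a, b, c = dot(u, u), dot(u, v), dot(v, v)
    d, e = dot(u, w), dot(v, w)
    denom = a * c - b * b              # ~0 when the rays are parallel
    t = (b * e - c * d) / denom        # closest-point parameter on ray A
    s = (a * e - b * d) / denom        # closest-point parameter on ray B
    p = tuple(pa + t * ui for pa, ui in zip(source_a, u))
    q = tuple(pb + s * vi for pb, vi in zip(source_b, v))
    return tuple((pi + qi) / 2 for pi, qi in zip(p, q))
```

For example, rays from sources (0, 0, 0) and (10, 0, 0) through marks back-projected toward a common anatomical point at (1, 2, 3) recover that point exactly; this is the same closest-approach computation that underlies standard two-view triangulation.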

Prosecution Timeline

Jul 29, 2024
Application Filed
Feb 10, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602893
VOLUMETRIC HEAT MAPS FOR INTELLIGENTLY DISTRIBUTING VIRTUAL OBJECTS OF INTEREST
2y 5m to grant; granted Apr 14, 2026

Patent 12592011
Digital Representation of Intertwined Vector Objects
2y 5m to grant; granted Mar 31, 2026

Patent 12585379
Methods and systems for generating image tool recommendations
2y 5m to grant; granted Mar 24, 2026

Patent 12579681
METHOD AND DEVICE FOR DRAWING SPATIAL MAP, CAMERA EQUIPMENT AND STORAGE MEDIUM
2y 5m to grant; granted Mar 17, 2026

Patent 12579743
ALIGNED AUGMENTED REALITY VIEWS
2y 5m to grant; granted Mar 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 85%
With Interview: 89% (+3.7%)
Median Time to Grant: 2y 11m
PTA Risk: Low

Based on 717 resolved cases by this examiner. Grant probability derived from career allow rate.
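The "With Interview" figure is consistent with simply adding the interview lift to the base grant probability and rounding. The sketch below is an assumed reconstruction of the panel's arithmetic, not a formula the tool documents:

```python
# Assumed reconstruction of the projection panel's arithmetic, using the
# examiner career totals shown elsewhere on this page.
granted, resolved = 612, 717        # career grants / resolved cases
interview_lift = 3.7                # percentage points

base = round(100 * granted / resolved)        # career allow rate
with_interview = round(base + interview_lift)
print(base, with_interview)  # 85 89, matching the displayed figures
```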
