Prosecution Insights
Last updated: April 19, 2026
Application No. 18/532,914

Referencing of Anatomical Structure

Non-Final OA (§102)
Filed: Dec 07, 2023
Examiner: LETT, THOMAS J
Art Unit: 2611
Tech Center: 2600 — Communications
Assignee: Novarad Corporation
OA Round: 1 (Non-Final)
Grant Probability: 83% (Favorable)
OA Rounds: 1-2
To Grant: 2y 8m
With Interview: 47%

Examiner Intelligence

Career Allow Rate: 83% — above average (599 granted / 719 resolved; +21.3% vs TC avg)
Interview Lift: -36.0% (resolved cases with vs. without interview)
Typical Timeline: 2y 8m avg prosecution; 26 currently pending
Career History: 745 total applications across all art units
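The headline figures above are simple ratios. A minimal sketch of the arithmetic, assuming "Career Allow Rate" is granted divided by resolved and the "vs TC avg" figure is a percentage-point delta (the 62.0% Tech Center average below is back-derived from 83.3% minus 21.3 points; it is not stated on this page):

```python
# Sketch of the Examiner Intelligence arithmetic (assumptions noted above).
granted, resolved = 599, 719

allow_rate = granted / resolved   # 0.833 -> displayed as 83%
tc_avg = 0.620                    # assumed: back-derived, not published here

print(f"Career allow rate: {allow_rate:.1%}")            # 83.3%
print(f"vs TC avg:         {allow_rate - tc_avg:+.1%}")  # +21.3%
```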

Statute-Specific Performance

§101: 11.1% (-28.9% vs TC avg)
§103: 27.4% (-12.6% vs TC avg)
§102: 47.6% (+7.6% vs TC avg)
§112: 11.6% (-28.4% vs TC avg)
Tech Center averages are estimates • Based on career data from 719 resolved cases
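Note that all four deltas back-solve to the same 40.0% baseline (11.1 + 28.9, 27.4 + 12.6, 47.6 - 7.6, 11.6 + 28.4), consistent with the footnote calling the Tech Center figure an estimate. A sketch of the likely arithmetic, assuming each rate is the share of this examiner's 719 resolved cases that drew at least one rejection under that statute, with hypothetical per-statute counts chosen to approximately reproduce the displayed values:

```python
# Sketch of the statute-specific table (hypothetical counts; the metric
# definition is assumed, not documented on this page).
RESOLVED = 719
TC_BASELINE = 0.400  # single estimated baseline implied by the deltas

cases_with_rejection = {"101": 80, "103": 197, "102": 342, "112": 83}

for statute, n in cases_with_rejection.items():
    rate = n / RESOLVED
    print(f"§{statute}: {rate:.1%} ({rate - TC_BASELINE:+.1%} vs TC avg)")
```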

Office Action

§102
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-32 and 35 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Ryan et al. (US 20200197107 A1).

Regarding claim 1, Ryan et al. (US 20200197107 A1) discloses a method for referencing anatomical structure using an AR headset (AR headset 104, figure 1), comprising: registering a marker located on a movable anatomical structure, wherein the marker has a first pose that includes a position and orientation in a 3D coordinate system of the AR headset (algorithms in the AR headset are used to process the images from the stereoscopic cameras (3904) to calculate the point of intersection of each fiducial (1108, 1110, 1112) and thereby determine the six-degrees of freedom pose of the marker 1104. For the purpose of this specification, “pose” is defined as the combination of position and orientation of an object, para. 0181); identifying the marker at a second pose of the movable anatomical structure having a second position and second orientation (distraction paddles 4706 and 4707 are pushed by the spring 4902 and pivot about an anteroposterior axis to provide a nearly equal and constant distraction force between each femoral condyle (4708, 4712) and the tibia 4704. The base element 4702 and distraction paddles (4706, 4704) include optical markers (4714, 4716) which allow the software to measure the degree of distraction of each femoral condyle (4708, 4712), para. 0223); and determining a joint pivot axis and angle by comparing the first pose and the second pose (distraction paddles 4706 and 4707 are pushed by the spring 4902 and pivot about an anteroposterior axis to provide a nearly equal and constant distraction force between each femoral condyle (4708, 4712) and the tibia 4704. The base element 4702 and distraction paddles (4706, 4704) include optical markers (4714, 4716) which allow the software to measure the degree of distraction of each femoral condyle (4708, 4712), para. 0223).

Regarding claim 2, Ryan et al. discloses the method as in claim 1, further comprising displaying the joint pivot axis, using the AR headset (algorithms in the AR headset are used to process the images from the stereoscopic cameras (3904) to calculate the point of intersection of each fiducial (1108, 1110, 1112) and thereby determine the six-degrees of freedom pose of the marker 1104, para. 0181).

Regarding claim 3, Ryan et al. discloses the method as in claim 1, further comprising: registering a second marker having a third position and third orientation on a proximal anatomical structure at a second pose (system 10 may use the topographical maps of the femur 4204 and tibia 4206 to track the poses of the respective bones (4204, 4206) in lieu of attaching a fiducial marker to the bones (4204, 4206), para. 0214); and computing the first pose and second pose with respect to the proximal anatomical structure using the second marker to enable free movement of the movable anatomical structure and the proximal anatomical structure (bracket 6910 and clamp 6908 fully surround brace 7106 and fit tightly against its sides, top, and bottom to prevent angular movement between the bracket components (6908, 6910) and the brace 7106, para. 0162).

Regarding claim 4, Ryan et al. discloses the method as in claim 1, further comprising displaying a plurality of points surrounding the joint pivot axis that are isometric to the joint pivot axis (markers (e.g., 100, 108, 110, etc.) for anatomic landmarks and tools are used for data collection (1000), which may be combined with pre-operative CT scan or MRI data for the determination of position and orientation (1002) of isometric points for ligament reconstruction and surgical tools. Algorithms (1006) are used to determine solutions including, but not limited to, precise localization of tunnel placement and assessment of results, para. 0237).

Regarding claim 5, Ryan et al. discloses the method as in claim 4, wherein the plurality of points forms at least one of: an arc (a first three-dimensional arc 4802 represents the medial laxity and a second three-dimensional arc 4804 represents the lateral laxity through the range of motion of the knee, para. 0224), a circle, a cylinder, a curved surface, or an irregular shape.

Regarding claim 6, Ryan et al. discloses the method as in claim 1, further comprising displaying a plurality of isometric points surrounding the joint pivot axis based on a defined ligament length to identify locations on a bone where a ligament is affixable at isometric distances to the joint pivot axis, using the AR headset (determination of position and orientation (1002) of isometric points for ligament reconstruction and surgical tools, para. 0237).

Regarding claim 7, Ryan et al. discloses the method as in claim 1, measuring a length of a bone from the joint pivot axis to a point defined on the bone (a reference point on the proximal femur to determine the change in leg length and lateral offset from the baseline measurement, para. 0192).

Regarding claim 8, Ryan et al. discloses the method as in claim 1, further comprising: calculating a change in the angle between the first pose and the second pose (system 10 can perturb the hip angle to calculate the angular range of motion allowed in each direction prior to impingement between implants, or between implants and bone (5410), para. 0197); and displaying a numerical output for the angle defining an angular change based in part on movement of the marker around the joint pivot axis (positional and angular misalignment relative to the target can also be displayed numerically as virtual text 2718, para. 0199).

Regarding claim 9, Ryan et al. discloses the method as in claim 1, further comprising aligning an image data set to a person using the marker attached to the person (at least one visual marker trackable by the camera and fixedly attached to a surgical tool, para. 0006).

Regarding claim 10, Ryan et al. discloses the method as in claim 1, further comprising identifying the joint pivot axis as an intersection of a first line that extends down a surface of a bone of a joint substantially parallel to the first pose of the marker and a second line that extends along a surface of the bone of the anatomic structure at the second pose (para. 0180).

Regarding claim 11, Ryan et al. discloses the method as in claim 1, wherein the movable anatomical structure and a proximal anatomical structure include bones forming a joint (superior apex of this virtual femur target is placed near the reference point on the moveable and proximal femur, para. 0192).

Regarding claim 12, Ryan et al. discloses the method as in claim 1, wherein a proximal anatomical structure includes a bone (para. 0192).

Regarding claim 13, Ryan et al. discloses the method as in claim 1, further comprising providing a graphical guide to guide surgical access to the joint pivot axis or a plurality of isometric points on a bone that is related to a joint (virtual guide 6410 is displayed to user 106 in display device 104, indicating the direction in which missing marker 6408 is likely to be found. Virtual guide 6410 may be a symbol, such as an arrow, or text indicating a direction, para. 0192).

Regarding claim 14, Ryan et al. discloses the method as in claim 1, wherein the marker is at least one of: an optical code, a 2D bar code, an infrared marker, or a radiopaque marker (a machine-readable serial number code 2902, a human readable serial number 2904, and a set of optical features which facilitate six-degree of freedom optical pose tracking such as a plurality of fiducials 2906. In one embodiment, the machine-readable number code 2902 pattern can be imaged by the camera(s) 3904, para. 0252).

Regarding claim 15, Ryan et al. discloses the method as in claim 1, further comprising identifying a proximal anatomical structure, which is fixed in the 3D coordinate system, and the movable anatomical structure is connected to the proximal anatomical structure (surgical helmet 3700 is optionally connected to a surgical hood (not shown) that provides full body coverage for the surgeon 3602, para. 0155).

Regarding claim 16, Ryan et al. discloses the method as in claim 15, further comprising determining the joint pivot axis and angle with respect to the proximal anatomical structure by comparing the first pose and the second pose (images can be simultaneously displayed, overlaid, mirrored, or otherwise manipulated to allow the user 106 to make comparisons, para. 0200).

Claims 17-32 and 35 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Ryan et al. (US 20200197107 A1).

Regarding claim 17, Oezbek et al. discloses a method for referencing anatomical structure of a person using an AR headset, comprising: registering a marker on a distal anatomical structure, which is connected with a proximal anatomical structure that does not move during a medical procedure (patient tracker coupled to a patient proximate to a region of interest, paras. 0018, 0034); identifying a first pose for the marker in a 3D (three dimensional) coordinate system and a second pose for the marker (Continuous detection of the position and/or orientation (i.e., pose) of the patient and/or the surgical instrument (so-called navigation data), paras. 0003, 0007, 0033; step of updating the augmented reality position alignment visualization 220 and/or the augmented reality angular alignment visualization 200, para. 0073); calculating a rotational angle of a displacement of the distal anatomical structure using the first pose and second pose (software and/or operating instructions may comprise planning system configured to find an accurate position and/or angular alignment of the surgical instrument 50 in relation to the patient 60, para. 0028); and displaying a graphical reference for the rotational angle of displacement, using the AR headset (augmented reality position alignment visualization 220 comprises two axis-aligned deviation vectors 222, 224 comprising the decomposition of a distance vector from a point on the target trajectory axis 210 to the tip 54 of the surgical instrument 50, or other portion on the surgical instrument 50, para. 0054).

Regarding claim 18, Oezbek et al. discloses the method as in claim 17, further comprising: setting a target position and rotation for the marker in the 3D coordinate system (augmented reality visualization may also comprise a target trajectory axis 210. The target trajectory axis 210 may represent a planned or intended surgical path, para. 0050); tracking a positional distance and the rotational angle of a displacement of the distal anatomical structure from a start point using the marker (tracking unit may be configured to continuously track the position and/or orientation (pose) of the head-mounted display, patient tracker, and surgical instrument within a localized or common coordinate system, para. 0007); and displaying the graphical reference for positional distance and rotational angle of displacement as the distal anatomical structure is moved and rotated from the start point (displaying the first angular vector 204 and the second angular vector 202 as lines connected by an arc representative of the deviation angle 206 between the first angular vector 204 and the second angular vector 202, para. 0072).

Regarding claim 19, Oezbek et al. discloses the method as in claim 18, further comprising providing at least one of a visual, audible or tactile indicator when the target position and rotation for the marker has been reached (highlighting the visualizations to the user based on the distance of the navigated instrument to each trajectory, para. 0095).

Regarding claim 20, Oezbek et al. discloses the method as in claim 17, wherein the distal anatomical structure and the proximal anatomical structure are parts of at least one of: a broken bone, a malrotated structure, a dislocated joint, or dislocated anatomical structure (if the surgical instrument 50 is misaligned with the direction and orientation corresponding to the first deviation vector 222 of the augmented reality position alignment visualization 220, the first deviation vector 222 may be highlighted on the lens 36 of the head-mounted display 30 to signal to the user that a correction in the alignment is needed based on the target trajectory axis 210, para. 0077).

Regarding claim 21, Oezbek et al. discloses a method for referencing anatomical structures for a joint of a person using an AR headset (a surgeon using a first configuration of a surgical navigation system including a head-mounted display and a surgical tracking unit, para. 0016), comprising: registering a first marker on a proximal anatomical structure of the joint and a second marker on a distal anatomical structure of the joint (patient markers 42, figure 2); identifying a first pose position for the second marker in a 3D (three dimensional) space (Continuous detection of the position and/or orientation (i.e., pose) of the patient and/or the surgical instrument (so-called navigation data), paras. 0003, 0007, 0033; step of updating the augmented reality position alignment visualization 220 and/or the augmented reality angular alignment visualization 200, para. 0073); setting a target position and rotation for the second marker in the 3D space (target trajectory axis 210 may represent a planned or intended surgical path. For example, the target trajectory axis 210 may represent the optimal or preferred angle or direction for aligning and/or inserting the surgical instrument 50 during execution of the medical procedure, para. 0050); identifying a second pose position for the second marker in a 3D space (determine the position and orientation (pose) of the surgical instrument 50); computing a positional distance and a rotational angle of a displacement of the distal anatomical structure using the first pose position and second pose position (a deviation angle which represents the angle between a first direction vector of the instrument axis and a second direction vector of the target trajectory axis, para. 0011); and displaying the positional distance and rotational angle as the distal anatomical structure is moved between the first pose position and the second pose position, using the AR headset (augmented reality (AR) visualizations that may be displayed in the head-mounted display 30 in registration with the patient 60 and the head-mounted display 30, para. 0032).

Regarding claim 22, Oezbek et al. discloses the method as in claim 21, further comprising aligning an image data set to the person using the first marker or the second marker attached to the person (a patient tracker registered to patient data and trackable by said surgical navigation system, para. 0010).

Regarding claim 23, Oezbek et al. discloses the method as in claim 21, providing visual, audible, tactile, or graphical feedback when the target position and location have been reached (displayed on the lens 36 of the HMD 30 in a first color, and the augmented reality angular alignment visualization 200 may be displayed on the lens 36 of the HMD 30 in a second color, para. 0065).

Regarding claim 24, Oezbek et al. discloses the method as in claim 23, further comprising displaying a graphical icon when the target position or rotation is reached (virtual images overlaid on live features that are illustrated in phantom, paras. 0019, 0020).

Regarding claim 25, Oezbek et al. discloses the method as in claim 21, further comprising: segmenting structures of anatomy related to the joint to form segments (an image of a slice of the patient data, such as a two-dimensional image of a specific vertebra, para. 0068); and moving the segments, as viewed through the AR headset, using changes in position and orientation of the first marker or the second marker (FIG. 8 illustrates another alternative configuration of the augmented reality visualization including a virtual image of a portion of the patient 260 and/or patient data as displayed on the lens 36 of the HMD 30, para. 0070).

Regarding claim 26, Oezbek et al. discloses the method as in claim 21, wherein the proximal anatomical structure and the distal anatomical structure include a broken portion of a bone (a multiple vertebrae fixation, para. 0025).

Regarding claim 27, Oezbek et al. discloses the method as in claim 21, further comprising defining a rotation axis that is a longitudinal axis of a bone in the proximal anatomical structure or the distal anatomical structure (instrument axis 240 may be defined by a line starting at the tip 54 of the surgical instrument 50, para. 0053).

Regarding claim 28, Oezbek et al. discloses the method as in claim 21, wherein the first marker and the second marker are at least one of: an optical code, 2D optical code (instrument markers 52 configured to be detectable by the position sensors 14 of the tracking unit 10, para. 0033), an infrared marker, or a radiopaque marker.

Regarding claim 29, Oezbek et al. discloses the method as in claim 21, further comprising registering a plurality of points on an optical code (HMD markers 34 (e.g., reflectors) for transmitting light signals (e.g., reflecting light emitted from the tracking unit 10) to the position sensor(s) 14, para. 0041).

Regarding claim 30, Oezbek et al. discloses the method as in claim 21, wherein the proximal anatomical structure is fixed in place within the 3D space (a virtual image of a slice of the patient image data shown in a fixed position floating frame 300 above the real surgical site or region of interest 62 as viewed through the lens 36, para. 0068).

Regarding claim 31, Oezbek et al. discloses a method for identifying anatomical structures for a ball joint of a person using an AR headset (a surgeon using a first configuration of a surgical navigation system including a head-mounted display and a surgical tracking unit, para. 0016), comprising: registering a marker in a first pose on a distal anatomical structure which is connected with a proximal anatomical structure (patient tracker 40 may comprise an attachment member 44 configured to secure the patient tracker 40 to the patient 60. The attachment member 44 may comprise a clamp, adhesive, strap, threaded fastener, or other similar attachment device. For example, the attachment member 44 may comprise a clamp configured to be secured to the patient 60. This may include utilizing a clamp to secure the patient tracker 40 to a vertebra of the patient 60 proximate to a region of interest 62. This may allow the tracking unit 10 to determine the position and/or orientation of the patient's spine during spinal surgery, para. 0034); registering a second pose of the marker after the distal anatomical structure has been moved with respect to the proximal anatomical structure (Continuous detection of the position and/or orientation (i.e., pose) of the patient and/or the surgical instrument (so-called navigation data) are necessary to provide an accurate spatial representation of the surgical instrument relative to the patient, para. 0003); determining a first joint pivot axis and angle with respect to the proximal anatomical structure by comparing the first pose and the second pose (deviation angle 206 which represents the angle between a first angular vector 204 representative of an axis offset and in parallel with the instrument axis 240 and a second angular vector 202 representative of the target trajectory axis 210, para. 0060); registering a third pose and fourth pose with the marker to identify a second joint pivot axis (two axis-aligned deviation vectors 222, 224 comprising the decomposition of a distance vector from a point on the target trajectory axis 210 to the tip 54 of the surgical instrument 50, or other portion on the surgical instrument 50. Axis-aligned may refer to the line(s) representing the deviation vector(s) 222, 224 being oriented to be parallel to one of the three major axes of a reference coordinate system, para. 0054); and determining an intersection of a first joint pivot axis and the second joint pivot axis that represents a joint pivot point of the ball joint (decomposition of the distance vector into the two axis-aligned deviation vectors 222, 224 illustrated in FIG. 4 may be made based on two eigenvectors derived from: the two of the three primary patient axes with the highest angle to the target trajectory axis 210, or the line of sight axis being projected onto the plane perpendicular to the target trajectory axis 210 and attached to the closest point on the target trajectory axis 210 from the tip 54 of the surgical instrument 54, and the perpendicular vector in the same plane to the projected line of sight axis, para. 0058).

Regarding claim 32, Oezbek et al. discloses the method as in claim 31, further comprising displaying the joint pivot point using the AR headset (the angle between a first angular vector 204 representative of an axis offset and in parallel with the instrument axis 240 and a second angular vector 202 representative of the target trajectory axis 210, para. 0060).

Regarding claim 35, Oezbek et al. discloses the method as in claim 31, wherein the joint pivot point is a ball joint in the person (joint, para. 0003).

Claims 36-52 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Tako et al. (US 20210022812 A1).

Regarding claim 36, Tako et al. discloses a method for referencing anatomical distance using an augmented reality (AR) headset, comprising: determining at least one boundary of an anatomical structure for a body (the surgeon 102 is able to visualize the boundaries of the tumor, para. 0156) of a person (image includes the scanned medical image (based on scan such as CT, MRI, Ultrasound, X-ray etc.) and the surgery instruments. It may also include real time video and models based on video form microscope or other sources. The SNAP provides a real time 3D interactive guided image for the surgeon. The orientation of the anatomical structures (i.e. head, brain, knee, shoulder etc.) is market and pre-registered both in the physical/patient's and the scanned medical image (CT, MRI, Ultrasound, X-ray etc.), para. 0084), using a medical image data set aligned to the body of the person (the orientation of the scanned medical image and the real anatomical structures of the patient's under the surgery are synchronized and aligned, para. 0084) and an anatomical structure detection service; registering a pointer device that has a position and orientation with respect to the anatomical structure, using the AR headset (saying keyword phrase “Show orientation” to take the user out of the SNAP Case view and give the user an overall look of his position, or pressing a virtual button in the Scene to select surgical tools for virtual use, para. 0130; sensor 2302 that tracks the orientation of the patient's head 2304 relative to the surgeon's position or point of observation 2306 and relative to the tools (microscope, endoscope, probe, etc., para. 0148); generating a virtual line that has an axis aligned with a lengthwise axis of the pointer device and passes through a portion of the anatomical structure (360 model can include additionally infused elements, such as manually added elements, for example, a trajectory path that the surgeon draws to assist in the surgery may be incorporated into the 360 model, para. 0113); and determining a measurement from an entry point where the virtual line enters a boundary of the anatomical structure to an additional point in the anatomical structure (By moving his head, the surgeon 102 is physically viewing the distance between one structure and another, and therefore he can get a real feel of the orientation of the structures and the intra-spherical relations between the structures, para. 0110).

Regarding claim 37, Tako et al. discloses the method as in claim 36, further comprising displaying a numerical output for a length of the virtual line between the entry point where the virtual line enters the anatomical structure to the additional point in the anatomical structure (surgeon 102 is physically viewing the distance between one structure and another, para. 0110).

Regarding claim 38, Tako et al. discloses the method as in claim 36, further comprising determining a length of a medical device to be used with the anatomical structure based on the length of the virtual line between the entry point where the virtual line enters the anatomical structure to the additional point in the anatomical structure (provides a spherical reference for tracking the surgery instruments and the OR microscope (and/or the surgeon head) and therefore allowing to present the surgery instruments image/model in space in relation to the scanned medical image, para. 0085).

Regarding claim 39, Tako et al. discloses the method as in claim 38, wherein the medical device is at least one of an implant, a screw, a needle, a stent or a trocar (implant, para. 0106; surgery instruments, para. 0084).

Regarding claim 40, Tako et al. discloses the method as in claim 36, wherein the additional point is a point defined by a user (offset, para. 0161).

Regarding claim 41, Tako et al. discloses the method as in claim 36, wherein the additional point is where the virtual line exits from the anatomical structure (surgeon 102 is able to visualize the boundaries of the tumor while he is resecting the meningioma and can see how far he is from vessels or other vital structures, para. 0156).

Regarding claim 42, Tako et al. discloses the method as in claim 36, wherein the additional point is a location along the virtual line with a distance offset internal from a location where the virtual line exits the anatomical structure (surgeon 102 is able to visualize the boundaries of the tumor while he is resecting the meningioma and can see how far he is from vessels or other vital structures, para. 0156).

Regarding claim 43, Tako et al. discloses the method as in claim 36, further comprising determining boundaries of the anatomical structure using edge detection, feature detection, shape detection, morphometric detection or machine learning (para. 0053).

Regarding claim 44, Tako et al. discloses the method as in claim 36, further comprising using an optical code connected to the pointer device to enable the AR headset to register the position and orientation of the pointer device (para. 0068).

Regarding claim 45, Tako et al. discloses the method as in claim 36, wherein the virtual line extends from a tip of the pointer device (para. 0069).

Regarding claim 46, Tako et al. discloses the method as in claim 36, wherein the proximal anatomical structure is fixed in place within a 3D space (orientation of the anatomical structures (i.e. head, brain, knee, shoulder etc.) is market and pre-registered both in the physical/patient's and the scanned medical image (CT, MRI, Ultrasound, X-ray etc.); therefore, the orientation of the scanned medical image and the real anatomical structures of the patient's under the surgery are synchronized and aligned, para. 0084).

Regarding claim 47, Tako et al. discloses a method for virtual repositioning of anatomical structure in an image data set, comprising: aligning the image data set to a person using a marker associated with the person (based on the information retrieved from the patient's own pre-operative scans (upon which the SNAP case has been built), and the alignment/registration process information, para. 0144); determining a joint pivot axis and angle by comparing a first pose and a second pose of movable anatomical structure of the person using the marker (step of receiving data input indicative of the surgeon's head moving to a second position, wherein the head movement comprises at least one of a change in angle of view and a change in direction of view, para. 0008); segmenting structures of anatomy in the image data set for the person to form anatomical segments in the image data set (para. 0053, para. 0055); and rotating the anatomical segments around the joint pivot axis in the image data set, as viewed, using an AR headset (system allows the surgeon to rotate the simulated image/model that is princely oriented as the real anatomical structure based on the tracking, and observe and evaluate the location and efficacy of the placed implant, para. 0106).

Regarding claim 48, Tako et al. discloses the method as in claim 47, further comprising rotating the anatomical segments around the joint pivot axis to represent flexion of the movable anatomical structure in the image data set (process of removing layers (one pre-set thickness at a time) of modeled anatomy perpendicular to the angle of view in order to reveal internal structures “slice by slice”. Layers are removed in the lateral/distal to the medial/proximal direction, para. 0054).

Regarding claim 49, Tako et al. discloses the method as in claim 47, further comprising rotating the anatomical segments around the joint pivot axis to an angle match an angle of flexion for a joint detectable by the AR headset (the SNAP computing device 112 matches or co-registers between the real organism, for example the head in neurosurgery, the system 100 sensors, the DICOM scans, and the MD6DM. The registration or “matching” process gives the ability to navigate during the surgery, para. 0154).

Regarding claim 50, Tako et al. discloses the method as in claim 47, wherein segmenting structures of anatomy further comprises identifying bones using edge detection, feature detection, shape detection, or machine learning (Surgical Navigation Advanced Platform (SNAP) is intended for use as a software interface and image segmentation system for the transfer of imaging information from CT or MR medical scanner to an output file. A tissue segmentation window is provided to edit and update tissues segmentation to prepare a case. The change in tissue segmentation is reflected in the 3D image, and the result can be saved as part of the case file. It is also intended as both pre and intra-operative software for simulating/evaluating surgical treatment options, para. 0093).

Regarding claim 51, Tako et al. discloses the method as in claim 47, further comprising registering a marker located on a movable anatomical structure of a person, wherein the marker has a first pose that includes a position and orientation in a 3D coordinate system of the AR headset (the orientation of the anatomical structures (i.e. head, brain, knee, shoulder etc.) is market and pre-registered both in the physical/patient's and the scanned medical image (CT, MRI, Ultrasound, X-ray etc.); therefore, the orientation of the scanned medical image and the real anatomical structures of the patient's under the surgery are synchronized and aligned, para. 0084; presented to the surgeon with a real time location and orientation of the instruments and markers in space in relation to the anatomical structures, para. 0086).

Regarding claim 52, Tako et al. discloses the method as in claim 47, wherein the anatomical segments are moved in a 3D image space of the image data set (paras. 0049, 0055).

Allowable Subject Matter

Claims 33 and 34 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to THOMAS J LETT whose telephone number is (571)272-7464. The examiner can normally be reached Mon-Fri 9-6 ET.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, King Y Poon, can be reached at 571-270-0728. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/THOMAS J LETT/
Primary Examiner, Art Unit 2617
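The step the §102 rejection maps onto claim 1 (determining a joint pivot axis and angle by comparing two marker poses) is standard rigid-body geometry. Below is a minimal sketch of that computation, not the applicant's or Ryan's actual implementation: it assumes each pose arrives as a 3x3 rotation matrix plus a translation vector in the AR headset's coordinate frame, and that the motion between the two poses is approximately a pure rotation about a fixed joint axis.

```python
# Minimal sketch: recover a joint pivot axis and angle from two 6-DoF marker
# poses (illustrative only; pose representation and pure-rotation model are
# assumptions, not taken from the application or the cited reference).
import numpy as np

def pivot_axis_and_angle(R1, t1, R2, t2):
    # Relative motion taking pose 1 to pose 2: x2 = R_rel @ x1 + t_rel
    R_rel = R2 @ R1.T
    t_rel = t2 - R_rel @ t1

    # Rotation angle from the trace; axis from the skew-symmetric part.
    # (Degenerate if the angle is near 0 or 180 degrees.)
    angle = np.arccos(np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0))
    axis = np.array([R_rel[2, 1] - R_rel[1, 2],
                     R_rel[0, 2] - R_rel[2, 0],
                     R_rel[1, 0] - R_rel[0, 1]])
    axis /= np.linalg.norm(axis)

    # Points p on the pivot axis are fixed by the motion: p = R_rel @ p + t_rel.
    # (I - R_rel) has rank 2 for a pure rotation, so least squares returns the
    # minimum-norm point lying on the axis.
    point, *_ = np.linalg.lstsq(np.eye(3) - R_rel, t_rel, rcond=None)
    return axis, np.degrees(angle), point

# Synthetic check: a 30-degree rotation about the z-axis through (0.1, 0.2, 0.3).
c, th = np.array([0.1, 0.2, 0.3]), np.radians(30)
Rz = np.array([[np.cos(th), -np.sin(th), 0.0],
               [np.sin(th),  np.cos(th), 0.0],
               [0.0,         0.0,        1.0]])
R1, t1 = np.eye(3), np.array([0.25, 0.05, 0.0])
axis, angle, point = pivot_axis_and_angle(R1, t1, Rz @ R1, Rz @ (t1 - c) + c)
print(axis, angle, point)  # ~[0 0 1], 30.0, a point on the pivot axis
```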

Prosecution Timeline

Dec 07, 2023
Application Filed
Nov 29, 2025
Non-Final Rejection — §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602714
LIGHTING AND INTERNET OF THINGS DESIGN USING AUGMENTED REALITY
2y 5m to grant • Granted Apr 14, 2026

Patent 12570401
Robot and Unmanned Aerial Vehicle (UAV) Systems for Cell Sites and Towers
2y 5m to grant • Granted Mar 10, 2026

Patent 12567217
SMART CONTENT RENDERING ON AUGMENTED REALITY SYSTEMS, METHODS, AND DEVICES
2y 5m to grant • Granted Mar 03, 2026

Patent 12561867
SYSTEMS AND METHODS FOR AUTOMATICALLY ADDING TEXT CONTENT TO GENERATED IMAGES
2y 5m to grant • Granted Feb 24, 2026

Patent 12555276
Image Generation Method and Apparatus
2y 5m to grant • Granted Feb 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 83%
With Interview: 47% (-36.0%)
Median Time to Grant: 2y 8m
PTA Risk: Low
Based on 719 resolved cases by this examiner. Grant probability derived from career allow rate.
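The "With Interview" figure is consistent with the interview lift being applied in percentage points rather than multiplicatively: 83% minus 36.0 points gives the displayed 47%. A one-line sketch of that assumed arithmetic:

```python
# Assumed projection arithmetic: interview lift applied in percentage points.
base = 599 / 719                        # career allow rate, ~83.3%
with_interview = max(0.0, base - 0.36)  # 83.3% - 36.0 points -> ~47.3%
print(f"{base:.0%} -> {with_interview:.0%} with interview")
```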

Free tier: 3 strategy analyses per month