Prosecution Insights
Last updated: April 19, 2026
Application No. 18/674,102

AUGMENTED REALITY SYSTEM WITH IMPROVED REGISTRATION METHODS AND METHODS FOR MULTI-THERAPEUTIC DELIVERIES

Non-Final OA: §102, §103, §112
Filed: May 24, 2024
Examiner: BARHAM, RYAN ALLEN
Art Unit: 2613
Tech Center: 2600 — Communications
Assignee: MediView XR Inc.
OA Round: 1 (Non-Final)
Grant Probability: 54% (Moderate)
Expected OA Rounds: 1-2
Time to Grant: 2y 8m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 54% (7 granted / 13 resolved cases; -8.2% vs TC avg)
Interview Lift: +60.0% (strong; based on resolved cases with interview)
Typical Timeline: 2y 8m average prosecution; 19 applications currently pending
Career History: 32 total applications across all art units
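
The headline figures above can be reproduced from the raw counts. The minimal Python sketch below assumes the allow rate is simply granted/resolved (7/13 ≈ 53.8%, displayed as 54%) and that the with-interview figure is the base rate plus the +60.0% lift, capped at 99%; the additive-lift-and-cap formula is an assumption, since the dashboard does not document its exact methodology.

    # Reproducing the examiner-intelligence figures from the raw counts.
    # ASSUMPTION: the with-interview figure is base rate + lift, capped at
    # 99% -- the dashboard does not document its actual formula.
    granted, resolved = 7, 13   # examiner career history
    interview_lift = 0.60       # +60.0% lift on interviewed cases

    allow_rate = granted / resolved                        # 0.538...
    with_interview = min(allow_rate + interview_lift, 0.99)

    print(f"Career allow rate: {allow_rate:.0%}")          # -> 54%
    print(f"With interview:    {with_interview:.0%}")      # -> 99%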

Statute-Specific Performance

§101: 2.1% (-37.9% vs TC avg)
§103: 48.2% (+8.2% vs TC avg)
§102: 45.4% (+5.4% vs TC avg)
§112: 2.8% (-37.2% vs TC avg)
Tech Center average is an estimate • Based on career data from 13 resolved cases
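
One consistency check worth noting: if each "vs TC avg" delta is read as a simple difference between the examiner's rate and the Tech Center estimate, all four statutes imply exactly the same underlying average. The sketch below is illustrative arithmetic; the flat-average reading is an inference from the displayed numbers, not a documented fact.

    # Implied Tech Center average per statute, assuming delta = rate - TC avg.
    rates  = {"§101": 2.1, "§103": 48.2, "§102": 45.4, "§112": 2.8}    # examiner, %
    deltas = {"§101": -37.9, "§103": 8.2, "§102": 5.4, "§112": -37.2}  # vs TC avg, %

    for statute in rates:
        print(f"{statute}: implied TC avg = {rates[statute] - deltas[statute]:.1f}%")
    # All four print 40.0%, suggesting the TC average estimate is a single
    # flat 40% figure rather than a per-statute value.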

Office Action

Grounds: §102, §103, §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 08/05/2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

The information disclosure statement (IDS) submitted on 10/31/2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Claim Objections

Claim 9 is objected to because of the following informality: line 2: "robotic" should read "robot". Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 14-16 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. Note: claim 15 depends from claim 12.

The term "3D module" in claims 14 and 15 is a relative term which renders the claim indefinite. The term "3D module" is not defined by the claims, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. For purposes of applying prior art, the Examiner interprets claim 14 as reading as follows: 14. The method of Claim 1, further including a step of registering a 3D model of the imaging volume dataset.

Claim 15 recites, in part, "the step of registering" (emphasis added). However, claims 1, 12, and 15 do not recite a "step of registering". Thus, claim 15 is indefinite because "the step of registering" lacks antecedent basis. For purposes of applying prior art, the Examiner interprets claim 15 as depending from claim 14, which recites, in part, "a step of registering". The Examiner interprets claim 15 as reading as follows: 15. The method of Claim 14, wherein the step of registering a 3D model of the imaging volume dataset includes a step of fabricating a physical 3D model.

Claim 16 recites, in part, "the physical 3D model" (emphasis added). However, claims 1, 13, and 16 do not recite a "physical 3D model". Thus, claim 16 is indefinite because "the physical 3D model" lacks antecedent basis. For purposes of applying prior art, the Examiner interprets claim 16 as depending from claim 15, which recites, in part, "a physical 3D model".

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1, 5, 12-17, and 19-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Poltaretskyi (US 20210161613 A1).

Regarding claim 1, Poltaretskyi teaches a method for registration of anatomy of a patient using augmented reality for a procedure on a patient (par. 0304: "In some examples, MR system 212 may use one of virtual markers or physical markers as a primary registration marker and use the other as a secondary, or supplemental, registration marker. As one example, MR system 212 may begin a registration process by attempting to perform registration using the primary registration marker." NOTE: "MR" in this passage refers to "Mixed Reality," which is considered to include Augmented Reality, as stated elsewhere in Poltaretskyi; par. 0167: "For purposes of this disclosure, MR is considered to include AR."), comprising:

providing a system including: an augmented reality system configured to display an augmented representation of an anatomical feature of the patient in an augmented reality (par. 0184: "MR system 212 may include a visualization device 213 that may be worn by the surgeon and (as will be explained in further detail below) is operable to display a variety of types of information, including a 3D virtual image of the patient's diseased, damaged, or postsurgical joint"); an imaging system configured to image the anatomical feature of the patient and generate an imaging dataset without the use of fiducial sensors (par. 0194: "The medical images generated during the image acquisition step include images of an anatomy of interest of the patient. For instance, if the patient's symptoms involve the patient's shoulder, medical images of the patient's shoulder may be generated."); a tracked instrument configured to be used during the procedure and to generate a tracked instrument dataset (par. 0209: "in some examples, the sensor data can be used to recognize surgical instruments and the position and/or location of those instruments."); and a computer system in communication with the augmented reality system, the imaging system, and the tracked instrument (par. 0826: "In the example of FIG. 122, computing system 12202 includes one or more processing circuits 12206, a data storage system 12208, and a set of one or more communication interfaces 12210A through 12210N (collectively, "communication interfaces 12210").");

positioning the tracked instrument in a first axial plane and collecting, from the tracked instrument, the tracked instrument dataset (par. 0184: "MR system 212 may include a visualization device 213 that may be worn by the surgeon and (as will be explained in further detail below) is operable to display a variety of types of information, including… …details of the surgical plan, such as a 3D virtual image of the prosthetic implant components selected for the surgical plan, 3D virtual images of entry points for positioning the prosthetic components, alignment axes and cutting planes for aligning cutting or reaming tools to shape the bone surfaces, or drilling tools to define one or more holes in the bone surfaces, in the surgical procedure to properly orient and position the prosthetic components, surgical guides and instruments and their placement on the damaged joint, and any other information that may be useful to the surgeon to implement the surgical plan.");

positioning the imaging system in a second axial plane and collecting, from the imaging system, the imaging dataset (par. 0357: "The visualization system (e.g., MR system 212/visualization device 213) may be configured to display different types of virtual guides. Examples of virtual guides include, but are not limited to, a virtual point, a virtual axis, a virtual angle, a virtual path, a virtual plane, and a virtual surface or contour.");

identifying an internal landmark of the patient with both the tracked instrument and the imaging system (par. 0293: "As illustrated in FIG. 30E, electromagnetic (EM) tracking system 3022 (which may be included within MR system 204 of FIG. 2) includes electromagnetic (EM) tracker 3024, field generator 3004, and one or more EM physical markers 3028A and 3028B (collectively "EM physical markers 3028"). EM physical markers 3006 may be positioned near and/or attached to the object to be tracked (e.g., observed bone structure 2200, an instrument, or the like) using the techniques described above.");

aligning the first axial plane and the second axial plane based on the internal landmark identified (par. 0944: "MR system 212 may display one or both of a virtual marker that identifies a prescribed plane of the cutting (e.g., as discussed above with reference to FIGS. 36A-36D) and/or an indication of whether the saw blade is aligned with the prescribed plane.");

generating, by the computer system, an imaging volume dataset based on the first axial plane and the second axial plane in 3D coordinates (par. 0163: "A surgical plan, e.g., as generated by the BLUEPRINT™ system or another surgical planning platform, may include information defining a variety of features of a surgical procedure, such as features of particular surgical procedure steps to be performed on a patient by a surgeon according to the surgical plan including, for example, bone or tissue preparation steps and/or steps for selection, modification and/or placement of implant components. Such information may include, in various examples, dimensions, shapes, angles, surface contours, and/or orientations of implant components to be selected or modified by surgeons, dimensions, shapes, angles, surface contours and/or orientations to be defined in bone or tissue by the surgeon in bone or tissue preparation steps, and/or positions, axes, planes, angle and/or entry points defining placement of implant components by the surgeon relative to patient bone or tissue. Information such as dimensions, shapes, angles, surface contours, and/or orientations of anatomical features of the patient may be derived from imaging (e.g., x-ray, CT, MRI, ultrasound or other images), direct observation, or other techniques.");

registering, by the computer system, the imaging volume dataset with the augmented reality system (par. 0304: "As one example, MR system 212 may begin a registration process by attempting to perform registration using the primary registration marker.");

rendering the imaging volume dataset into an imaging volume hologram (par. 0498: "MR system 212 may render the illustration of the location of the portion of the anatomy and the location of the tool relative to the portion of the anatomy may from any view (e.g., top view, left side view, right side view, above view, below view, etc.)."); and

projecting, by the augmented reality system, the imaging volume hologram (par. 0550: "The virtual information may be projected as holographic imagery on the screen of an MR visualization device, such as visualization device 213, e.g., via holographic lenses, that a holographic image is overlaid on a real-world surgical item visible through the screen.").

Regarding claim 5, Poltaretskyi teaches the method of Claim 1, further including steps of: locating sagittal midline on the patient (par. 0490: "relative to the patient, the images presented in secondary view window 6704 may be in a sagittal axis."); and rotating the imaging volume hologram such that a holographic sagittal midline of the imaging volume hologram and sagittal midline on the patient are congruent (par. 1129: "in some cases, manipulations by VR/MR surgical help device 14508 on a virtual model of patient anatomy in space may appear to MR surgical device 14504 as virtual manipulations on a registered virtual model that is registered to the patient anatomy in the operating room. For example, the user of VR/MR surgical help device 14508 may demonstrate a desired location of a virtual reaming axis relative to a virtual model in space, and this demonstration may appear to MR surgical device 14504 as a virtual reaming axis that is properly positioned relative to patient anatomy since the virtual model is registered to the patient anatomy when viewed by MR surgical device 14504.").

Regarding claim 7, Poltaretskyi teaches the method of Claim 1, wherein the tracked instrument dataset includes an electromagnetic dataset (par. 0292: "Electromagnetic tracking (i.e., tracking using electromagnetic physical markers, referred to as "EM tracking") may be accomplished by positioning sensors within a magnetic field of known geometry, which may be created by a field generator (FG). The sensors may measure magnetic flux or magnetic fields. A tracking device may control the FG and receive measurements from the sensors. Based on the received measurements, the tracking device may determine the locations/positions of the sensors.").

Regarding claim 12, Poltaretskyi teaches the method of Claim 1, wherein the augmented reality system includes a head down display for projecting the imaging volume hologram (par. 0571: "a surgeon may perform the surgery while wearing a head-mounted MR visualization device of intraoperative system 108 that presents guidance information to the surgeon." NOTE: This head-mounted display could be interpreted as a head down display when the user is turning their head downward.).

Regarding claim 13, Poltaretskyi teaches the method of claim 1, further including steps of: providing a predetermined ablation zone (par. 0363: "As shown in FIGS. 42A-49, MR system 212 may provide virtual guidance to assist a surgeon in humeral preparation, such as cutting to remove all or a portion of the humeral head." NOTE: a "predetermined ablation zone" is understood here to mean "an area selected in advance of a medical procedure involving tissue removal"); and correcting the predetermined ablation zone based on the imaging volume hologram (par. 0379: "MR system 212 may display one or both of a virtual marker that identifies a center point or prescribed axis of the reaming (e.g., as discussed above with reference to FIGS. 36A-36D) and/or an indication of whether graft reaming tool 3700 is aligned with the prescribed axis.").

Regarding claim 14, Poltaretskyi teaches the method of Claim 1, further including a step of registering a 3D module of the imaging volume dataset (par. 0338: "For a shoulder arthroplasty application, the registration process may start by visualization device 213 presenting the user with 3D virtual bone model 1008 of the patient's scapula and glenoid that was generated from preoperative images of the patient's anatomy, e.g., by surgical planning system 102." NOTE: see Claim 14 112(b) rejection above).

Regarding claim 15, Poltaretskyi teaches the method of Claim 12, wherein the step of registering a 3D module of the imaging volume dataset includes a step of fabricating a physical 3D model (par. 1006: "MR teacher device 12702 and MR student device 12704 may be configured to perform a registration process for registering a virtual model to an actual anatomical feature, e.g., registering a virtual model of a glenoid to an actual glenoid or a physical model, such as from a cadaver or a synthetic bone model, respectively." NOTE: see Claim 15 112(b) rejection above).

Regarding claim 16, Poltaretskyi teaches the method of Claim 13, wherein the physical 3D model includes an image target location for registering the physical 3D model into augmented reality coordinates (par. 1000: "MR educational content 12706 may comprise virtual markers relative to a 3D virtual representation of one or more anatomical features (e.g., a virtual model) or relative to a physical model or cadaver anatomy. The virtual markers may specify locations, points or axes for drilling, reaming, grinding, preparation for an implant, attachment of an implant, or anything that might be shown for intra-operative surgical guidance with respect to anatomy of a patient.").

Claim 17 is functionally identical to claim 1, as the method of claim 1 teaches providing a system that includes all the elements of the system of claim 17. The only substantial difference is that claim 17 specifies that the computer system is configured to carry out the additional elements of the method of claim 1. Therefore, claim 17 is rejected on similar grounds as claim 1.

Regarding claim 19, Poltaretskyi teaches the system of Claim 17, wherein the imaging system includes at least one of a computed tomography (CT) system (par. 0194: "the images may be generated using a Computed Tomography (CT) process, a Magnetic Resonance Imaging (MRI) process, an ultrasound process, or another imaging process."), a fluoroscopy system (par. 0949: "In some examples, the surgeon may utilize fluoroscopy to perform the tibial tray trialing."), positron emission computed tomography (par. 0194, as above), magnetic resonance imaging (MRI) system (par. 0194, as above), and an ultrasound (US) system (par. 0194, as above).

Regarding claim 20, Poltaretskyi teaches the system of Claim 17, wherein the augmented reality system includes a head down display (par. 0200: "a surgeon may perform the surgery while wearing a head-mounted MR visualization device of intraoperative system 108 that presents guidance information to the surgeon." NOTE: This head-mounted display could be interpreted as a head down display when the user is turning their head downward.).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 2-4 are rejected under 35 U.S.C. 103 as being unpatentable over Poltaretskyi (US 20210161613 A1) as applied to claim 1 above, and further in view of Kovtun (US 11304759 B2).

Regarding claim 2, Poltaretskyi teaches the method of claim 1. Poltaretskyi teaches planar alignment (par. 0908: "The placement of the cutting guide is then refined by adjusting three angles relative to the three anatomical planes (axial, sagittal and coronal)."), but fails to specify wherein the first axial plane and the second axial plane are aligned to be orthogonal. Kovtun teaches that orthogonal planar alignment is known in the art (col. 29, lines 9-12: "Conventional viewers often only allow planes along the cardinal orthogonal axes of the array (e.g., the axial, coronal and sagittal planes of the subject) to be viewed."). It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to align the first and second axial planes in an orthogonal fashion, as both Poltaretskyi and Kovtun are in the same field of endeavor of presenting medical imaging data in an extended reality environment. Doing so is well known in the art and commonly practiced.

Regarding claim 3, Poltaretskyi teaches the method of claim 1. Poltaretskyi teaches planar alignment (par. 0908: "The placement of the cutting guide is then refined by adjusting three angles relative to the three anatomical planes (axial, sagittal and coronal)."), but fails to specify wherein the first axial plane and the second axial plane are aligned to be oblique. Kovtun teaches oblique planar alignment (col. 27, lines 54-65: "process 600 can perform a trilinear interpolation that determines the value based on eight voxels that form the corners of a cube centered on the current location of the ray. Note that, because the clip object can intersect the 3D array at an oblique angle (e.g., not along the axial, coronal, or sagittal plane), there may not be a voxel value corresponding to the point at the current location of the ray. Interpolation of the value can provide an approximation of the value at that point (or a relatively precise value when the ray is co-located with the position of the voxel) without causing a pixelated appearance by relying on merely using the value of the nearest voxel."). It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to align the first and second axial planes in an oblique fashion, as both Poltaretskyi and Kovtun are in the same field of endeavor of presenting medical imaging data in an extended reality environment. It would obviously be necessary to fine-tune a surgical procedure along more planes than only an orthogonal one, as every human body is different and requires a different approach to facilitate a successful operation.

Regarding claim 4, Poltaretskyi teaches the method of claim 1. Poltaretskyi teaches planar alignment (par. 0908: "The placement of the cutting guide is then refined by adjusting three angles relative to the three anatomical planes (axial, sagittal and coronal)."), but fails to specify wherein the first axial plane and the second axial plane are aligned to be coplanar. Kovtun teaches that coplanar alignment is known in the art (col. 14, lines 16-22: "Because the best approach (e.g., as determined by the user) may be non-coplanar with any of the axial, sagittal, and coronal planes, conventional medical imaging techniques that only allow viewing along these planes may not allow the user to discover a path that can be discovered in a virtual environment."). It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to align the first and second axial planes in a coplanar fashion, as both Poltaretskyi and Kovtun are in the same field of endeavor of presenting medical imaging data in an extended reality environment. Doing so is well known in the art and commonly practiced.

Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Poltaretskyi (US 20210161613 A1) as applied to claim 1 above, and further in view of Maier-Hein (US 9498132 B2).

Regarding claim 6, Poltaretskyi teaches the method of Claim 1, further including segmenting a skin surface image from the imaging dataset, thereby creating skin surface imaging data (par. 1001: "The virtual tissue model may be segmented, which may allow for different layers (skin, fat, muscle and bone) of the virtual tissue model to be shown, exposed and manipulated by students wearing MR student device 12704 or by a teacher wearing MR teacher device 12702."). Poltaretskyi fails to teach aligning, by a tracking sensor, the skin surface image data to the patient; and registering the skin surface image data to the imaging dataset and the tracked instrument dataset. Maier-Hein teaches segmenting a skin surface image from the imaging dataset, thereby creating skin surface imaging data (col. 14, lines 23-26: "The visualization system 10 further comprises a visualization means equipped with a graphic processing unit and with medical image processing software to segment the skin as well as all other structures of interest."); aligning, by a tracking sensor, the skin surface image data to the patient (col. 20, lines 39-41: "Guiding may be facilitated if the sensor means track the instrument by sensing a distance of the sensor means to said instrument."); and registering the skin surface image data to the imaging dataset and the tracked instrument dataset (col. 20, lines 7-15: "To register the 3D-planning image to the patient, the 3D-range camera may be used to acquire a surface representing the patient's skin above the target region. This surface information is matched to the corresponding surface extracted from the 3D medical image data, as described above. This registration yields the pose of critical structures and other relevant planning data, such as the needle trajectory, relative to the intra-interventionally acquired surface."). It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to incorporate tracking sensor alignment and image dataset registration into Poltaretskyi's skin surface segmentation, as both are techniques well known in the art and mentioned in other parts of Poltaretskyi. It is a simple matter to incorporate techniques Poltaretskyi was already using into another part of the same invention.

Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Poltaretskyi (US 20210161613 A1) as applied to claim 1 above, and further in view of Ida (US 20210315637 A1).

Regarding claim 8, Poltaretskyi teaches the method of claim 1, further including: aligning a robotic image target within 3D cartesian coordinates (par. 0635: "motion tracking device 10906 may record angles for maximum points relative to particular planes, polar coordinates of the maximum points, spherical coordinates of the maximum points, Cartesian coordinates of the maximum points, or other types of data to indicate the maximum points."); and localizing the robotic image target with augmented reality system coordinates (par. 0447: "MR system 212 may obtain, from the virtual surgical plan, coordinates on the virtual model of the prosthesis and vector for each of the fasteners."). Poltaretskyi fails to teach registering a surgical robot for the procedure, including attaching a robotic image target to a portion of the surgical robot at a predetermined position and a predetermined orientation. Ida teaches registering a surgical robot for the procedure, including: attaching a robotic image target to a portion of the surgical robot at a predetermined position and a predetermined orientation (par. 0007: "The one or more processors are configured to plan a position of a port to be perforated on a body surface of a subject which is a target of the robotic surgery, acquire a captured image obtained by capturing the subject including at least a part of the subject by an overview camera included in the robot main body, recognize a planned position of the port in the captured image based on the captured image and the planned position of the port, and show the captured image and port position information indicating the planned position of the port in the subject illustrated in the captured image, on a display unit."). It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to register a surgical robot for an augmented reality procedure. While Poltaretskyi does not explicitly teach registering a surgical robot, it does consider broadcasting a surgical operation to a remote viewer via an MR device (FIG. 115 and 116; par. 0722: "a local surgeon using MR may be guided and aided by the expertise of a remote physician using VR."; par. 1119: "the surgical expert may join the procedure occurring in the operating room (either physically or remotely) via VR/MR surgical help device 14508."), which could easily be used to facilitate a remotely operated surgical robot for anyone who desired to use Poltaretskyi's device to allow a surgeon to intervene more directly over a great distance.

Claim 18 is rejected under 35 U.S.C. 103 as being unpatentable over Poltaretskyi (US 20210161613 A1) as applied to claim 17 above, and further in view of Petkov (US 20230255692 A1).

Regarding claim 18, Poltaretskyi teaches the system of claim 17, but fails to teach any particular instrument being tracked. Petkov teaches wherein the tracked instrument includes at least one of a needle and an electromagnetically tracked ultrasound (par. 0012: "the received data indicative of the anatomical structure may be received during the surgical procedure, e.g., from a camera attached to a surgical needle, from a gyroscopic sensor at a handle of a surgical device (also: surgical instrument), from ultrasound images, and/or from an external tracking system, e.g., optical tracking (which may also be denoted as reflective tracking) and/or from electromagnetic tracking."). It would have been obvious to one of ordinary skill in the art prior to the effective filing date to utilize Poltaretskyi's instrument tracking to monitor the position and location of a needle or an ultrasound in operations that warranted such devices. Poltaretskyi's device is intended to be used specifically for ankle surgery procedures; if one of ordinary skill in the art intended to adapt the technology toward other forms of surgery, tracking a needle or an ultrasound would be a necessary and obvious innovation.

Allowable Subject Matter

Claims 9-11 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to RYAN A BARHAM whose telephone number is (571) 272-4338. The examiner can normally be reached Mon-Fri, 8:30am-5pm EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Xiao Wu, can be reached at (571) 272-7761. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/RYAN ALLEN BARHAM/
Examiner, Art Unit 2613

/XIAO M WU/
Supervisory Patent Examiner, Art Unit 2613

Prosecution Timeline

May 24, 2024
Application Filed
Jan 12, 2026
Non-Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this examiner on similar technology

Patent 12564345
MEDICAL APPARATUS, AND IMAGE GENERATION METHOD FOR VISUALIZING TEMPORAL TRENDS OF BIOMAGNETIC DATA ON AN ORGAN MODEL
2y 5m to grant • Granted Mar 03, 2026
Patent 12548109
Preserving Tumor Volumes for Unsupervised Medical Image Registration
2y 5m to grant • Granted Feb 10, 2026
Patent 12530836
OBJECT TRANSITION BETWEEN DEVICE-WORLD-LOCKED AND PHYSICAL-WORLD-LOCKED
2y 5m to grant • Granted Jan 20, 2026
Study what changed to get past this examiner, based on the 3 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 54%
With Interview: 99% (+60.0%)
Median Time to Grant: 2y 8m
PTA Risk: Low
Based on 13 resolved cases by this examiner. Grant probability derived from career allow rate.
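
For what it's worth, the "1-2" round projection is consistent with a simple geometric model in which each office-action round independently ends in allowance with probability equal to the 54% career allow rate. This model is an assumption for illustration only; the page does not state how its projections are computed.

    # ASSUMED model: rounds-to-allowance ~ Geometric(p), so E[rounds] = 1/p.
    # Illustration only; the dashboard's actual projection model is unknown.
    p = 0.54                              # career allow rate
    print(f"E[OA rounds] = {1 / p:.1f}")  # -> 1.9, within the projected 1-2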
