Prosecution Insights
Last updated: April 19, 2026
Application No. 18/313,612

SYNCHRONIZED PLACEMENT OF SURGICAL IMPLANT HARDWARE

Non-Final OA (§102/§103)

Filed: May 08, 2023
Examiner: TRAN, JENNY NGAN
Art Unit: 2615
Tech Center: 2600 — Communications
Assignee: Ix Innovation LLC
OA Round: 3 (Non-Final)

Grant Probability: 20% (At Risk)
Estimated OA Rounds: 3-4
Estimated Time to Grant: 2y 6m
Grant Probability With Interview: 70%

Examiner Intelligence

Career Allow Rate: 20% (1 granted / 5 resolved; -42.0% vs TC avg)
Interview Lift: +50.0% across resolved cases with an interview
Avg Prosecution: 2y 6m (typical timeline)
Total Applications: 36 across all art units (31 currently pending)
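The arithmetic behind these headline figures is simple to reproduce. A minimal sketch, with the values copied from the dashboard above (the variable names are my own, not fields from any actual API):

```python
# Career allow rate: granted cases over resolved cases.
granted = 1
resolved = 5
allow_rate = granted / resolved * 100  # percent

# Interview lift is a percentage-point difference: the estimated
# allowance probability with an interview (70%) minus without (20%).
with_interview = 70.0
without_interview = 20.0
lift = with_interview - without_interview

print(f"Career allow rate: {allow_rate:.0f}%")   # 20%
print(f"Interview lift: +{lift:.1f} points")     # +50.0 points
```

Note that with only 5 resolved cases, these rates carry wide uncertainty.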

Statute-Specific Performance

§101: 8.9% (-31.1% vs TC avg)
§103: 49.0% (+9.0% vs TC avg)
§102: 21.8% (-18.2% vs TC avg)
§112: 18.3% (-21.7% vs TC avg)

Tech Center averages are estimates; figures are based on career data from 5 resolved cases.
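As a consistency check, the implied Tech Center average can be backed out of each row: it is the examiner's rate minus the reported delta. A quick sketch with the data transcribed from the list above:

```python
# (rate, delta vs TC avg) per statute, as shown in the dashboard.
rows = {
    "§101": (8.9, -31.1),
    "§103": (49.0, +9.0),
    "§102": (21.8, -18.2),
    "§112": (18.3, -21.7),
}

for statute, (rate, delta) in rows.items():
    # implied TC average = examiner rate - reported delta
    tc_avg = round(rate - delta, 1)
    print(f"{statute}: implied TC avg = {tc_avg}%")
```

Every row implies the same 40.0% Tech Center average, which suggests a single blended baseline rather than per-statute TC averages.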

Office Action

Rejections under §102 and §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of the Claims

Claims 1-20 are currently pending in the present application, with claims 1, 8, and 15 being independent.

Response to Arguments

Applicant's arguments filed 2/10/2026 have been fully considered but they are not persuasive.

Applicant argues: Poltaretskyi et al. (WO 2019245864) does not disclose “assembling the first surgical implant component and the second surgical implant component within the anatomy to form the surgical implant based on the implantation plan”.

Examiner replies: Poltaretskyi expressly discloses an XR/MR surgical planning environment, including a digital anatomical model and implant component planning using patient-specific 3D anatomy models (Par. 0162-0163, Par. 0169, Par. 0172, Par. 0188); further discloses a multi-component virtual implant model including, for example, a ball component, cup component, and humeral stem component, generated by selecting and arranging virtual implant-component objects (Par. 0244); and further discloses simulation of the implanted components relative to the anatomy, including testing selection, placement, and positioning/moving components of the 3D virtual model and 3D virtual implant model to evaluate collisions/impingement for a given virtual plan (Par. 0237-0245). Poltaretskyi also expressly teaches projecting the surgical plan on the real anatomy, including entry points for positioning a prosthetic implant (Par. 0322-0324), and actual intraoperative installation of glenoid and humeral implant components within the patient to form the prosthetic joint (Par. 0330-0333). 
Under broadest reasonable interpretation, the recitation of “assembling the first surgical implant component and the second surgical implant component within the anatomy to form the surgical implant based on the implantation plan” encompasses positioning and installing multiple implant components within the patient anatomy according to the surgical plan, such that the components together form the implanted prosthetic system. In addition, the recitation of “in-vivo assembly” in claim 5 is worded broadly enough that, as long as a plurality of simulations are performed and the implantation plan is generated based on those simulations, it encompasses the limitations of claim 5 as currently worded. The claim language does not require any separately recited mechanical coupling or simultaneous joining step inside the body, and, when considered as a whole, Poltaretskyi teaches the claimed limitations of “assembling the first surgical implant component and the second surgical implant component within the anatomy to form the surgical implant based on the implantation plan” in claim 1 and “in-vivo assembly” in claim 5; therefore, claims 1 and 5 remain anticipated.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 
102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-2, 4-9, 11-16, and 18-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Poltaretskyi et al. (WO 2019245864), hereinafter referred to as “Poltaretskyi”.

Regarding claim 1, Poltaretskyi discloses an extended-reality (XR) computer-implemented method comprising: generating an XR environment comprising a digital anatomical model representing anatomical features (Par. 0169; XR…Par. 0172; Visualization tools are available that utilize patient image data to generate three-dimensional models of bone contours…Par. 0188; user can manipulate the user interface…to request and view details of the surgical plan for the particular patient, including a 3D virtual model of the anatomy of interest…and/or a 3D model of the prosthetic component selected to repair an anatomy of interest…Par. 0322; MR system 212 may present 3D virtual objects such that the objects appear to reside within a real environment, e.g., with real anatomy of a patient…virtual images of the surgical plan may include one or more of the 3D virtual model of the anatomy of interest, a 3D model of the prosthetic component selected to repair the anatomy of interest…), the XR environment configured to enable virtual positioning and assembly of digital models of surgical implants in the digital anatomical model (Par. 
0163; A surgical plan, e.g., as generated by the BLUEPRINT™ system or another surgical planning platform, may include information defining a variety of features of a surgical procedure, such as features of particular surgical procedure steps to be performed on a patient by a surgeon according to the surgical plan including, for example, bone or tissue preparation steps and/or steps for selection, modification and/or placement of implant components. Such information may include, in various examples, dimensions, shapes, angles, surface contours, and/or orientations of implant components to be selected or modified by surgeons, dimensions, shapes, angles, surface contours and/or orientations to be defined in bone or tissue by the surgeon in bone or tissue preparation steps, and/or positions, axes, planes, angle and/or entry points defining placement of implant components by the surgeon relative to patient bone or tissue…Par. 0172; The surgeon can use the BLUEPRINT™ system to select, design or modify appropriate implant components, determine how best to position and orient the implant components and how to shape the surface of the bone to receive the components, and design, select or modify surgical guide tool(s) or instruments to carry out the surgical plan…Par. 0184; 3D virtual image of the prosthetic implant components selected for the surgical plan, 3D virtual images of entry points for positioning the prosthetic components, alignment axes and cutting planes for aligning cutting or reaming tools to shape the bone surfaces, or drilling tools to define one or more holes in the bone surfaces, in the surgical procedure to properly orient and position the prosthetic components, surgical guides and instruments and their placement on the damaged joint…Par. 
0188; 3D virtual model of the anatomy of interest (e.g., a 3D virtual bone model of the anatomy of interest, such as a glenoid bone or a humeral bone) and/or a 3D model of the prosthetic component selected to repair an anatomy of interest…Par. 0235-236; 3D virtual bone model 1008 and the 3D model of the implant components 1010), virtually positioning (Par. 0184; 3D virtual image of the prosthetic implant components selected for the surgical plan, 3D virtual images of entry points for positioning the prosthetic components, alignment axes and cutting planes for aligning cutting or reaming tools to shape the bone surfaces, or drilling tools to define one or more holes in the bone surfaces, in the surgical procedure to properly orient and position the prosthetic components, surgical guides and instruments and their placement on the damaged joint…Par. 0244; the computing system may also generate, based on a set of surgical parameters, a 3D virtual implant model for the joint (1110)…The set of surgical parameters may indicate sizes, shapes, positions, or other aspects of components of one or more components of an implant for the joint. Generating the 3D virtual implant model may comprise selecting and arranging virtual objects that correspond to components of the implant that have the sizes indicated by the set of surgical parameters) a first surgical implant component model and a second surgical implant component model in the digital anatomical model (Par. 0188; 3D virtual model of the anatomy of interest (e.g., a 3D virtual bone model of the anatomy of interest, such as a glenoid bone or a humeral bone) and/or a 3D model of the prosthetic component selected to repair an anatomy of interest…Par. 0235-236; 3D virtual bone model 1008 and the 3D model of the implant components 1010. Par. 0243-0244; In an example where the joint is a shoulder joint, the 3D virtual implant model may include a ball component, a cup component, and a humeral stem component, as shown in FIG. 
11 A. Virtual implant models for other joints or other surgeries on the shoulder joint may include different components), generating an implantation plan (Par. 0162-0163; mixed reality (MR) visualization system to assist with creation, implementation, verification, and/or modification of a surgical plan before and during a surgical procedure… A surgical plan, e.g., as generated by the BLUEPRINT™ system or another surgical planning platform, may include information defining a variety of features of a surgical procedure, such as features of particular surgical procedure steps to be performed on a patient by a surgeon according to the surgical plan including. Par. 0184; MR system 212 can be used by a surgeon before (e.g., preoperatively) or during the surgical procedure (e.g., intraoperatively) to create, review, verify, update, modify and/or implement a surgical plan) for assembling a first surgical implant component corresponding to the first surgical implant component model and a second surgical implant component corresponding to the second surgical implant component model to form a surgical implant within an anatomy of a patient based on the virtual positioning (Par. 0162-0163; … for example, bone or tissue preparation steps and/or steps for selection, modification and/or placement of implant components. Such information may include, in various examples, dimensions, shapes, angles, surface contours, and/or orientations of implant components to be selected or modified by surgeons, dimensions, shapes, angles, surface contours and/or orientations to be defined in bone or tissue by the surgeon in bone or tissue preparation steps, and/or positions, axes, planes, angle and/or entry points defining placement of implant components by the surgeon relative to patient bone or tissue. Par. 
0172; The surgeon can use the BLUEPRINT™ system to select, design or modify appropriate implant components, determine how best to position and orient the implant components and how to shape the surface of the bone to receive the components, and design, select or modify surgical guide tool(s) or instruments to carry out the surgical plan. The information generated by the BLUEPRINT™ system is compiled in a preoperative surgical plan for the patient that is stored in a database…including before and during the actual surgery. Par. 0184; …3D virtual image of the prosthetic implant components selected for the surgical plan, 3D virtual images of entry points for positioning the prosthetic components, alignment axes and cutting planes for aligning cutting or reaming tools to shape the bone surfaces, or drilling tools to define one or more holes in the bone surfaces, in the surgical procedure to properly orient and position the prosthetic components, surgical guides and instruments and their placement on the damaged joint, and any other information that may be useful to the surgeon to implement the surgical plan. MR system 212 can generate images of this information that are perceptible to the user of the visualization device 213 before and/or during the surgical procedure. Par. 0244; a 3D virtual implant model for the joint (1110). In an example where the joint is a shoulder joint, the 3D virtual implant model may include a ball component, a cup component, and a humeral stem component, as shown in FIG. 11 A. Virtual implant models for other joints or other surgeries on the shoulder joint may include different components…Par. 0322-0324; MR system 212 may output, for viewing by a user, virtual images of the virtual surgical plan projected within a real environment, where the virtual images of the virtual surgical plan including the 3D virtual model of the anatomy of interest. 
In this example, the virtual images of the surgical plan may include one or more of the 3D virtual model of the anatomy of interest, a 3D model of the prosthetic component selected to repair the anatomy of interest, and virtual images of a surgical workflow to repair the anatomy of interest…), performing one or more virtual simulations of moving the first surgical implant component into the anatomy of the patient, moving the second surgical implant component into the anatomy of the patient (Par. 0237-0240, and Fig. 11A; selection of range-of-motion icon 1102 on navigation bar 1012 of the "Planning" page launches a range-of-motion in which the user can test or confirm the selection, placement and/or positioning of the implant components 1010 by simulating various different motions of the anatomy with the prosthetic implant implanted…MR system 212 may present an animation of the humerus of 3D virtual model 1008 moving in each of the movement types listed in range-of-motion menu 1104…the impingement angles represent angles determined for a given virtual surgical plan, including implant components, along with component size, position and angle, specified by the virtual plan…Visualization of the simulated ranges of motion using MR system 212…Par. 0245; the computing system may determine the impingement angles by moving components of the 3D virtual bone model and 3D virtual implant model and detecting where collisions between virtual objects in the 3D virtual bone model and 3D virtual implant model occur. Par. 0322-0324; The steps of the virtual surgical plan projected on the real anatomy of interest include identification of an entry point for positioning a prosthetic implant to repair the real anatomical feature of interest…), and assembling the first surgical implant component and the second surgical implant component (Par. 0244; a 3D virtual implant model for the joint (1110). 
In an example where the joint is a shoulder joint, the 3D virtual implant model may include a ball component, a cup component, and a humeral stem component, as shown in FIG. 11 A. Virtual implant models for other joints or other surgeries on the shoulder joint may include different components…Par. 0330; glenoid implant…humeral implant…Par. 0331; glenoid implant…screws. Par. 0332-0333; humerus implant…glenoid implant) within the anatomy to form the surgical implant based on the implantation plan (Par. 0184; MR system 212 can be used by a surgeon before (e.g., preoperatively) or during the surgical procedure (e.g., intraoperatively) to create, review, verify, update, modify and/or implement a surgical plan…prosthetic implant components selected for the surgical plan, 3D virtual images of entry points for positioning the prosthetic components, alignment axes and cutting planes for aligning cutting or reaming tools to shape the bone surfaces, or drilling tools to define one or more holes in the bone surfaces, in the surgical procedure to properly orient and position the prosthetic components…Par. 0330; In this reverse shoulder arthroplasty, the surgeon may install a humeral implant that has a concave surface that slides over the convex surface of the glenoid implant. As in the other steps of the shoulder surgery of FIG. 19, an MR system (e.g., MR system 212, MR system 1800A, etc.) may present virtual guidance to help the surgeon perform the glenoid installation process…Par. 0331; the glenoid implantation process includes a process to fix the glenoid implant to the patient’s scapula (1914)…includes drilling one or more anchor holes or one or more screw holes into the patient’s scapula and positioning an anchor such as one or more pegs or a keel of the implant in the anchor hole(s) and/or inserting screws through the glenoid implant and the screw holes…An MR system (e.g., MR system 212, MR system 1800A, etc.) 
may present virtual guidance to help the surgeon with the process of fixing the glenoid implant the glenoid bone, e.g., including virtual guidance indicating anchor or screw holes to be drilled or otherwise formed in the glenoid, and the placement of anchors or screws in the holes…Par. 0332-0333; In instances where the surgeon is performing an anatomical shoulder arthroplasty, the humerus implant may have a convex surface that acts as a replacement for the patient’s natural humeral head. The convex surface of the humerus implant slides within the concave surface of the glenoid implant. In instances where the surgeon is performing a reverse shoulder arthroplasty, the humerus implant may have a concave surface and the glenoid implant has a corresponding convex surface…), wherein the surgical implant remains within the patient after installation (Par. 0162; Orthopedic surgery can involve implanting one or more prosthetic devices to repair or replace a patient's damaged or diseased joint. Par. 0330-0333. Par. 0324; The steps of the virtual surgical plan projected on the real anatomy of interest include identification of an entry point for positioning a prosthetic implant to repair the real anatomical feature of interest), wherein the assembling is performed using an XR device displaying an augmented-reality (AR) environment mapped to the anatomical features (Par. 0169; XR…Par. 0246; A MR visualization device, such as visualization device 213 (FIG. 2), may present a MR visualization that includes the 3D virtual bone model, the 3D virtual implant model, and visual elements indicating a plurality of impingement angles (1114)), after virtual installation of the surgical implant within the patient, performing a comparison of the virtually installed surgical implant within the patient to the implantation plan (Par. 0237-0242, Fig. 
11A; “Planning” page launches a range-of-motion mode in which the user can test or confirm the selection, placement and/or positioning of the implant components 1010 by simulating various different motions of the anatomy with the prosthetic implant implanted according to the preoperative surgical plan for the patient…The impingement angles represent angles determined for a given virtual surgical plan, including implant components, along with component size, position and angle, specified by the virtual surgical plan), and updating the implantation plan based on the comparison (Par. 0240-0242; Visualization of the simulated ranges of motion using MR system 212 can help the surgeon confirm the surgical plan or may lead the surgeon to update or modify the preoperative surgical plan…If a bony impingement (i.e., a collision) occurs at an angle within the normal range of motion for a patient, this may indicate to the surgeon that a change in certain parameters of the surgical plan (e.g., size, type, position or orientation of implant components) may be needed…).

Regarding claim 2, Poltaretskyi discloses the method of claim 1, and further discloses comprising performing confidence-score AR mapping of the AR environment to the anatomical features to meet a confidence threshold for assembling the first and second surgical implant components (Par. 0309-0311; confidence distance).

Regarding claim 4, Poltaretskyi discloses the method of claim 1, and further discloses comprising repeatedly simulating the virtual positioning using different tool paths and insertion parameters until the virtual positioning meets approval criteria (Par. 
0184; MR system 212 can be used by a surgeon before (e.g., preoperatively) or during the surgical procedure (e.g., intraoperatively) to create, review, verify, update, modify and/or implement a surgical plan…prosthetic implant components selected for the surgical plan, 3D virtual images of entry points for positioning the prosthetic components, alignment axes and cutting planes for aligning cutting or reaming tools to shape the bone surfaces, or drilling tools to define one or more holes in the bone surfaces, in the surgical procedure to properly orient and position the prosthetic components…Par. 0237-0240 and Fig. 11A; placement and/or positioning of the implant components 1010 by simulating various different motions of the anatomy with the prosthetic implant implanted according to the preoperative surgical plan for the patient).

Regarding claim 5, Poltaretskyi discloses the method of claim 1, and further discloses comprising performing a plurality of virtual simulations (Par. 0237-0245) for in-vivo assembly of the surgical implant (Par. 0184. Par. 0188. Par. 0324. Par. 0330-0333. Examiner’s note: Simulations are performed in the context of planning and confirming the implantation/installation of the multi-component prosthetic system that is installed within the patient. Multi-component implant models, surgically planned positioning, simulation of implanted component behavior, and intraoperative installation of implant components within the patient teach the limitation of in-vivo assembly), wherein the implantation plan is generated based on the virtual simulations (Par. 0240; Visualization of the simulated ranges of motion using MR system 212 can help the surgeon confirm the surgical plan or may lead the surgeon to update or modify the preoperative surgical plan).

Regarding claim 6, Poltaretskyi discloses the method of claim 1, and further discloses comprising mapping the AR environment to the anatomical features using a machine-learning platform (Par. 
0211; the sensor data can be processed using a Simultaneous Localization and Mapping (SLAM) algorithm, or other known or future-developed algorithms for processing and mapping 2D and 3D image data and tracking the position of visualization device 213 in the 3D scene. In some examples, image tracking may be performed using sensor processing and tracking functionality provided by the Microsoft HOLOLENS™ system, e.g., by one or more sensors and processors 514 within a visualization device 213. Par. 0283; MR system 212 may use a machine learned model (i.e., use machine learning, such as a random forest algorithm) to process the image data and identify the location of the anatomy of interest), wherein the machine-learning platform comprises a plurality of surgery-type-specific machine learning modules to be applied to image data of the patient to provide anatomical surgery-type mapping (Par. 0196; changes made during the manual correction step may be used as training data to refine the machine learning techniques applied by virtual planning system 102 during the automatic processing step. Par. 0613-0617; machine learning to learn and predict where a surgeon is within the given surgical procedure (e.g., which step) in order to predict the next step (by identifying the current step of a surgical plan) and to predict the next surgical item that is needed (based on the current step or the current surgical item being used)…By implementing a machine learning algorithm, processing device 8304 (or visualization device 213) may be configured to predict surgical items needed for a procedure following an intraoperative change to that procedure…).

Regarding claim 7, Poltaretskyi discloses the method of claim 1, and further discloses retrieving modeling parameters for generating the digital anatomical model (Par. 
0178; a storage system 206, and a network 208 that allows a user at healthcare facility 204 to access stored patient information, such as medical history, image data corresponding to the damaged joint or bone and various parameters corresponding to a surgical plan that has been created preoperatively (as examples)), generating the digital anatomical model according to the modeling parameters (Par. 0243; a computing system, such as MR system 212 or preoperative surgical planning system 202 (FIG. 2) may generate, based on medical images of a patient, a 3-dimensional (3D) virtual model of a joint of the patient (1108). The joint may be various types of joints, such as the shoulder joint, ankle, knee, elbow, or wrist), identifying the anatomical features within the digital anatomical model (Par. 0861; computing system 12202 may segment the medical images to identify internal structures of the current patient, such as soft tissue and bone. For instance, in one example, computing system 12202 may apply an artificial neural network trained to identify portions of medical images that correspond to bones or soft tissue); and assigning anatomical characteristics to the identified anatomical features for display within the AR environment (FIG. 10-11A; example UI of a MR system. Par. 
0184; MR system 212 may include a visualization device 213 that may be worn by the surgeon and (as will be explained in further detail below) is operable to display a variety of types of information, including a 3D virtual image of the patient’s diseased, damaged, or postsurgical joint and details of the surgical plan, such as a 3D virtual image of the prosthetic implant components selected for the surgical plan, 3D virtual images of entry points for positioning the prosthetic components, alignment axes and cutting planes for aligning cutting or reaming tools to shape the bone surfaces, or drilling tools to define one or more holes in the bone surfaces, in the surgical procedure to properly orient and position the prosthetic components, surgical guides and instruments and their placement on the damaged joint, and any other information that may be useful to the surgeon to implement the surgical plan. MR system 212 can generate images of this information that are perceptible to the user of the visualization device 213 before and/or during the surgical procedure).

Regarding claim 8, claim 8 is the system claim (Par. 0178-0183 and Fig. 2; Processing device(s) 210) of method claim 1, and is accordingly rejected using substantially the same rationale as that set forth with respect to claim 1.

Regarding claim 9, claim 9 has limitations similar to those of claim 2, except it is a system claim (Par. 0178-0183 and Fig. 2; Processing device(s) 210), and is therefore rejected under the same rationale as claim 2.

Regarding claim 11, claim 11 has limitations similar to those of claim 4, except it is a system claim (Par. 0178-0183 and Fig. 2; Processing device(s) 210), and is therefore rejected under the same rationale as claim 4.

Regarding claim 12, claim 12 has limitations similar to those of claim 5, except it is a system claim (Par. 0178-0183 and Fig. 2; Processing device(s) 210), and is therefore rejected under the same rationale as claim 5. 
Regarding claim 13, claim 13 has limitations similar to those of claim 6, except it is a system claim (Par. 0178-0183 and Fig. 2; Processing device(s) 210), and is therefore rejected under the same rationale as claim 6.

Regarding claim 14, claim 14 has limitations similar to those of claim 7, except it is a system claim (Par. 0178-0183 and Fig. 2; Processing device(s) 210), and is therefore rejected under the same rationale as claim 7.

Regarding claim 15, claim 15 is the CRM claim (Par. 0178-0183 and Fig. 2; Storage device(s) (M) 215 and storage system 206) of method claim 1, and is accordingly rejected using substantially the same rationale as that set forth with respect to claim 1.

Regarding claim 16, claim 16 has limitations similar to those of claim 2, except it is a CRM claim (Par. 0178-0183 and Fig. 2; Storage device(s) (M) 215 and storage system 206), and is therefore rejected under the same rationale as claim 2.

Regarding claim 18, claim 18 has limitations similar to those of claim 4, except it is a CRM claim (Par. 0178-0183 and Fig. 2; Storage device(s) (M) 215 and storage system 206), and is therefore rejected under the same rationale as claim 4.

Regarding claim 19, claim 19 has limitations similar to those of claim 5, except it is a CRM claim (Par. 0178-0183 and Fig. 2; Storage device(s) (M) 215 and storage system 206), and is therefore rejected under the same rationale as claim 5.

Regarding claim 20, claim 20 has limitations similar to those of claim 6, except it is a CRM claim (Par. 0178-0183 and Fig. 2; Storage device(s) (M) 215 and storage system 206), and is therefore rejected under the same rationale as claim 6.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 3, 10, and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Poltaretskyi et al. (WO 2019245864), hereinafter referred to as “Poltaretskyi”, in view of Quiròs et al. (WO 2017175055), hereinafter referred to as “Quiròs”.

Regarding claim 3, Poltaretskyi discloses the method of claim 1, and further discloses segmenting image data of the patient to identify the anatomical features (Par. 0861; computing system 12202 may segment the medical images to identify internal structures of the current patient, such as soft tissue and bone. For instance, in one example, computing system 12202 may apply an artificial neural network trained to identify portions of medical images that correspond to bones or soft tissue), and performing a virtual-reality (VR) simulation (Par. 0787; the orthopedic surgeon may use the VR visualization device to perform a simulation of the orthopedic surgery) with one or more anatomical identification prompts (Fig. 12 and Par. 0236; icons 1218 and 1220. Fig. 15A; menu 1510). 
Poltaretskyi does not disclose labeling the anatomical features. In the same art of medical imaging systems, Quiròs discloses labeling the anatomical features (Par. 00106 and Fig. 14B).

It would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the teachings of Poltaretskyi with those of Quiròs. The motivation lies in the advantage of providing users with clear and accurate anatomical information, allowing users not only to identify anatomical regions but also to interactively label them in a virtual environment. This combination yields a predictable enhancement for simulation-based training and planning tools.

Regarding claim 10, claim 10 has similar limitations to those of claim 3, except that it is a system claim (Poltaretskyi Par. 0178-0183 and Fig. 2; Processing device(s) 210); it is therefore rejected under the same rationale as claim 3.

Regarding claim 17, claim 17 has similar limitations to those of claim 3, except that it is a CRM claim (Poltaretskyi Par. 0178-0183 and Fig. 2; Storage device(s) (M) 215 and storage system 206); it is therefore rejected under the same rationale as claim 3.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JENNY NGAN TRAN, whose telephone number is (571) 272-6888. The examiner can normally be reached Mon-Thurs, 8am-5pm.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Alicia Harrington, can be reached at (571) 272-2330. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JENNY N TRAN/
Examiner, Art Unit 2615

/ALICIA M HARRINGTON/
Supervisory Patent Examiner, Art Unit 2615

Prosecution Timeline

May 08, 2023: Application Filed
Jul 08, 2025: Non-Final Rejection (§102, §103)
Sep 24, 2025: Applicant Interview (Telephonic)
Sep 24, 2025: Examiner Interview Summary
Oct 10, 2025: Response Filed
Dec 04, 2025: Final Rejection (§102, §103)
Feb 10, 2026: Response after Non-Final Action
Mar 02, 2026: Request for Continued Examination
Mar 04, 2026: Response after Non-Final Action
Mar 20, 2026: Non-Final Rejection (§102, §103) (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12499589
SYSTEMS AND METHODS FOR IMAGE GENERATION VIA DIFFUSION
2y 5m to grant · Granted Dec 16, 2025
Study what changed to get past this examiner. Based on the 1 most recent grant.


Prosecution Projections

3-4
Expected OA Rounds
20%
Grant Probability
70%
With Interview (+50.0%)
2y 6m
Median Time to Grant
High
PTA Risk
Based on 5 resolved cases by this examiner. Grant probability derived from career allow rate.
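The projection figures above combine in a simple way: the base grant probability is the examiner's career allow rate (1 granted / 5 resolved = 20%), and the interview lift is applied as an additive percentage-point bump (+50.0 points → 70%). A minimal sketch of that arithmetic follows; the function names are illustrative assumptions, not the tool's actual code.

```python
# Illustrative sketch (assumed logic) of how the projection figures combine,
# working in whole percentage points to match the displayed values.

def grant_probability_pct(granted: int, resolved: int) -> int:
    """Base grant probability: career allow rate as a whole percentage."""
    return round(100 * granted / resolved)

def with_interview_pct(base_pct: int, lift_pct: int) -> int:
    """Interview lift applied as an additive percentage-point bump, capped at 100."""
    return min(base_pct + lift_pct, 100)

base = grant_probability_pct(1, 5)        # 20  -> "20% Grant Probability"
boosted = with_interview_pct(base, 50)    # 70  -> "70% With Interview (+50.0%)"
print(f"{base}% base, {boosted}% with interview")
```

Note that the lift reads as additive percentage points rather than a relative multiplier, since 20% x 1.5 would give 30%, not the 70% shown.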
