Prosecution Insights
Last updated: April 19, 2026
Application No. 18/645,629

AUGMENTED REALITY SYSTEM AND METHOD WITH PERIPROCEDURAL DATA ANALYTICS

Non-Final OA (§101, §102, §103)
Filed: Apr 25, 2024
Examiner: KIM, KAITLYN EUNJI
Art Unit: 3797
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: MediView XR, Inc.
OA Round: 1 (Non-Final)
Grant Probability: 58% (Moderate)
OA Rounds: 1-2
To Grant: 3y 2m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 58% (grants 58% of resolved cases; 7 granted / 12 resolved; -11.7% vs TC avg)
Interview Lift: +65.7% (strong) for resolved cases with interview
Typical Timeline: 3y 2m avg prosecution; 37 currently pending
Career History: 49 total applications across all art units
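The headline numbers above can be reproduced with a few lines of arithmetic. In the sketch below, the 7 granted / 12 resolved counts come from this page, but the with/without-interview allow rates are hypothetical (chosen only to illustrate how a relative-lift figure like +65.7% arises; the page does not disclose the underlying interview counts).

```python
# Sketch of how this page's headline examiner metrics can be derived.
# The 7 granted / 12 resolved counts are from the page above; the
# with/without-interview rates below are HYPOTHETICAL, used only to
# illustrate the relative-lift formula.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate: share of resolved cases that were granted."""
    return granted / resolved

def interview_lift(rate_with: float, rate_without: float) -> float:
    """Relative lift from interviewing, as a fraction (0.657 -> +65.7%)."""
    return rate_with / rate_without - 1

career = allow_rate(7, 12)                 # 0.5833... -> displayed as 58%
print(f"Career allow rate: {career:.0%}")  # Career allow rate: 58%

# Hypothetical allow rates with and without an interview:
lift = interview_lift(0.90, 0.543)
print(f"Interview lift: {lift:+.1%}")      # Interview lift: +65.7%
```

Note that 7/12 is 58.3%, so the displayed 58% is a rounded value.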

Statute-Specific Performance

§101: 11.9% (-28.1% vs TC avg)
§103: 42.2% (+2.2% vs TC avg)
§102: 21.4% (-18.6% vs TC avg)
§112: 22.5% (-17.5% vs TC avg)
Compared against a Tech Center average estimate • Based on career data from 12 resolved cases
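The per-statute deltas above are internally consistent: subtracting each "vs TC avg" delta from the examiner's rate recovers the same Tech Center average estimate for every statute. A minimal sketch of that arithmetic follows; the figures are the ones shown above, though the page does not define precisely what each percentage measures (likely the rate at which rejections under that statute are overcome).

```python
# The statute-specific figures shown on the page. Subtracting each
# "vs TC avg" delta from the examiner's rate recovers the implied
# Tech Center average; note it comes out the same for every statute.
rates = {  # statute: (examiner rate %, delta vs TC avg %)
    "101": (11.9, -28.1),
    "103": (42.2, +2.2),
    "102": (21.4, -18.6),
    "112": (22.5, -17.5),
}

for statute, (rate, delta) in rates.items():
    tc_avg = rate - delta  # implied Tech Center average
    print(f"§{statute}: examiner {rate}%, implied TC avg {tc_avg:.1f}%")
# Every statute implies the same TC average of 40.0%.
```

The single implied value (40.0%) is consistent with the averages all being drawn from one Tech Center baseline rather than computed per statute.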

Office Action

Rejections: §101, §102, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

Claims 1-20 are pending in this application. Claims 19-20 are withdrawn, and Claims 1-18 have been examined on the merits.

Election/Restrictions

Claims 19-20 are withdrawn from further consideration pursuant to 37 CFR 1.142(b), as being drawn to nonelected Invention Group II, drawn to a method of performing and updating a medical procedure on a patient (Claim 19), and Invention Group III, drawn to an augmented reality system for enhancing precision and outcomes in medical procedures (Claim 20). The species restrictions for Species Group 1 and Species Group 2 have been withdrawn. Applicant timely traversed the restriction (election) requirement in the reply filed on 10/14/2025. Applicant's election with traverse of Invention Group I (Claims 1-18), Species Group 1A (Claim 10), and Species Group 2A (Claim 16) in the reply filed on 10/14/2025 is acknowledged. The traversal is on the ground(s) that a serious burden does not exist for searching and examining all of claims 1-20. Further, Applicant argues the inventions all share the same fundamental inventive concept of an augmented reality system providing intraoperative guidance during medical procedures by displaying holographic representations, recording procedural metrics, and establishing feedback loops for continuous improvement. Further, it is argued that the alleged mutually exclusive characteristics of the Species groups do not exist: Invention Group I involves real-time holographic adjustments but different aspects of the same feedback loop, and Invention Group II represents complementary system capabilities that would naturally be implemented together.
This is not found persuasive because, although the inventive concept of an augmented reality system is the same, Inventions I and II are distinct as subcombinations, with Invention II having separate utility, such as the three-dimensional model of a portion of an anatomy of the patient. Inventions I and III are also subcombinations, with Invention III having separate utility, such as the machine-readable instructions and the communication interface. Inventions II and III are also distinct as subcombinations, as they have separate utility, such as the machine-readable instructions for transforming the tracking data and the user interface for the practitioner to interact with the augmented reality system. The requirement is still deemed proper and is therefore made FINAL.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-18 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Claim 1 recites providing a procedural plan of the medical procedure, recording a metric during the medical procedure, updating the procedural plan, and using the updated procedural plan as the procedural plan. The limitation of providing a procedural plan of the medical procedure is a process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind, i.e., using pen and paper or verbally providing a procedural plan. Further (although it is not required), if these processes are performed by an “inherent” processor, these claimed steps could easily be performed by a generic computer component, as the claimed limitations do not require any specialized processor.
Similarly, recording a metric during the medical procedure is a process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind, i.e., using pen and paper to record a metric or mentally taking note of a metric. Similarly, the limitations of updating the procedural plan, and using the updated procedural plan as the procedural plan, are processes that, under their broadest reasonable interpretation, cover performance of the limitations in the mind, i.e., mentally taking note of an updated plan or using pen and paper to write and use an updated plan. If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind, then it falls within the “Mental Processes” grouping of abstract ideas. Accordingly, the claim recites an abstract idea.

The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element of using processors to perform the identifying and determining steps amounts to no more than mere instructions to apply the exception using generic processors. Mere instructions to apply an exception using generic processors cannot provide an inventive concept. The claim is not patent eligible. The other independent claim, claim 20, also recites limitations similar to those of claim 8, and is also found not patent eligible at least for the reasons noted above. The dependent claims 2-18 are also directed to an abstract idea, as the dependent claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. The elements in those claims do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
Therefore, the dependent claims are also not patent eligible.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-15 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Aguirre-Valencia (US20160381256A1).

Regarding Claim 1, Aguirre-Valencia teaches a method of performing and updating a medical procedure on a patient by a practitioner using an augmented reality system (corresponding disclosure in at least [0170], where the method uses augmented reality and updates a surgical procedure “the set of 3D stereoscopic images may be dynamically updated during the surgical procedure if the surgeon deviates from the predefined surgical plan.
In particular, the set of 3D stereoscopic images may be displayed on a 3D display (such as a monitor, an augmented-reality headset”), the method comprising: providing a procedural plan of the medical procedure (corresponding disclosure in at least [0169], where there is a plan provided of the procedure “the surgeon (or a physician) may visualize patient specific anatomy that has been identified as part of the preoperative plan at its patient specific location through the patient's body and plan out a series of tasks or operations they will perform on a patient during a planned surgical procedure”); performing the medical procedure on the patient utilizing the procedural plan and the augmented reality system (corresponding disclosure in at least [0170], where the procedure is being performed using augmented reality “In particular, the set of 3D stereoscopic images may be displayed on a 3D display (such as a monitor, an augmented-reality headset…Computer system 600 (FIG. 6) may step through the set of 3D stereoscopic images as the surgical procedure progresses”); recording a metric during the medical procedure using the augmented reality system to provide a recorded metric (corresponding disclosure in at least [0170], where there is a recorded metric (the location of the tool) “Computer system 600 (FIG. 6) may step through the set of 3D stereoscopic images as the surgical procedure progresses, e.g., based on user input via a user interface or voice commands, or based on information that specifies the surgeon's actions or location in patient's anatomy. 
For example, the information may include dynamic tracking of the surgeon and/or one or more of the surgeon's surgical tools (such as a scalpel)”); updating the procedural plan based on the recorded metric to provide an updated procedural plan (corresponding disclosure in at least [0170], where the plan is updated based on the recorded metric (analysis of surgical tool location) providing an updated plan “If the surgeon intentionally or unintentionally deviates from the predefined surgical plan (e.g., using on-the-fly or dynamic analysis of the location of a surgical tool relative to the predefined surgical plan), computer system 600 (FIG. 6) may update or revise the displayed 3D stereoscopic images. The revised 3D stereoscopic images may reflect or indicate: that a deviation from the predefined surgical plan has occurred (e.g., via visual, auditory and/or sensory feedback)”); and using the updated procedural plan as the procedural plan, thereby establishing a feedback loop for the procedural plan of the medical procedure (corresponding disclosure in at least [0174] and Figure 16, where there is a feedback loop (the steps repeat) which updates the plan “Thus, when generating the stereoscopic images (operation 1614) or preparing the stereoscopic images, information from optionally tracked motion (operation 1610) and/or the optionally tracked interaction may be used to generate or revise the view and projection matrices”). 
[Figure 16 of Aguirre-Valencia]

Regarding Claim 2, Aguirre-Valencia further teaches identifying a difference between the procedural plan and the updated procedural plan (corresponding disclosure in at least [0170], where the plan is updated “Furthermore, the set of 3D stereoscopic images may be dynamically updated during the surgical procedure if the surgeon deviates from the predefined surgical plan” and further in [0171], where the plans and updates are all saved, so the differences can be identified “Before, during and/or at the end of the surgical procedure, the surgical plan, the locations of the surgical instruments and the progression of the surgical procedure, and the outcome may be saved in a computer-readable data structure for subsequent analysis and/or use”).

Regarding Claim 3, Aguirre-Valencia further teaches providing the updated procedural plan as the procedural plan for a subsequent medical procedure (corresponding disclosure in at least [0171], where the updated plan can be saved for subsequent use “Before, during and/or at the end of the surgical procedure, the surgical plan, the locations of the surgical instruments and the progression of the surgical procedure, and the outcome may be saved in a computer-readable data structure for subsequent analysis and/or use”).
Regarding Claim 4, Aguirre-Valencia further teaches wherein the augmented reality system provides intraoperative guidance during the medical procedure by providing an augmented reality display of an aspect of the procedural plan and tracking of a medical instrument (corresponding disclosure in at least [0170], where there is an augmented reality system using intraoperative guidance showing the plan and tracking of a medical instrument (surgical tool) “the set of 3D stereoscopic images may be displayed on a 3D display (such as a monitor, an augmented-reality headset… the information may include dynamic tracking of the surgeon and/or one or more of the surgeon's surgical tools (such as a scalpel)”).

Regarding Claim 5, Aguirre-Valencia further teaches wherein the intraoperative guidance includes a step of alerting the practitioner to a deviation from the procedural plan (corresponding disclosure in at least [0170], where there is an alert when there is a deviation from the plan “The revised 3D stereoscopic images may reflect or indicate: that a deviation from the predefined surgical plan has occurred (e.g., via visual, auditory and/or sensory feedback), the current location (relative to the patient's anatomy), and/or how to correct the deviation so that the surgical procedure follows the predefined surgical plan”).

Regarding Claim 6, Aguirre-Valencia further teaches wherein the step of alerting includes a visual or auditory cue (corresponding disclosure in at least [0170], where the alert is visual or auditory “The revised 3D stereoscopic images may reflect or indicate: that a deviation from the predefined surgical plan has occurred (e.g., via visual, auditory and/or sensory feedback)”).
Regarding Claim 7, Aguirre-Valencia further teaches wherein providing the updated procedural plan as the procedural plan includes a step of adjusting a holographic representation by the augmented reality system (corresponding disclosure in at least [0050], where each step is updated via tracking “ one or more optional position sensors 116 (which may be separate from or integrated into display 114) may dynamically track movement of the head of viewer 122 with up to six degrees of freedom, and this head-tracking information (e.g., the positions of the eyes of viewer 122 relative to display 114) may be used by graphics engine 112 to update the view and frustum matrices and, thus, the rendered left-eye and right-eye images” and further in [0049], where it is specified the plans are demonstrated in a holographic representation “These images may be appropriately scaled or sized so that the images match the physical parameters of the viewing geometry (including the position of viewer 122 and size 126 of the display 114). This may facilitate the holographic effect for viewer”); and the method further includes a step of providing the updated procedural plan as the procedural plan for a subsequent medical procedure (corresponding disclosure in at least [0171], where the updated plan can be saved for subsequent use “Before, during and/or at the end of the surgical procedure, the surgical plan, the locations of the surgical instruments and the progression of the surgical procedure, and the outcome may be saved in a computer-readable data structure for subsequent analysis and/or use”). 
Regarding Claim 8, Aguirre-Valencia further teaches wherein the holographic representation includes a three-dimensional model of a portion of an anatomy of the patient derived from medical imaging data (corresponding disclosure in at least [0047], where there is an input of medical imaging data “data engine 110 may receive input data (such as a computed-tomography or CT scan, histology, an ultrasound image, a magnetic resonance imaging or MRI scan, or another type of 2D image slice depicting volumetric information), including dimensions and spatial resolution” and further in [0049], where the data is then rendered to be 3D “These model, view and frustum matrices may be used by graphics engine 112 to render images of the 3D objects. For a given eye, the rendered image may provide a 2.5D monoscopic projection view on display 114. By sequentially displaying left-eye and right-eye images that include image parallax (i.e., stereoscopic images), 3D information may be presented on display”).

Regarding Claim 9, Aguirre-Valencia further teaches wherein the recorded metric obtained during the medical procedure includes data selected from the group consisting of instrument positioning metrics, instrument interaction metrics, operative action metrics, outcome-related metrics, procedure efficiency metrics, and system interaction metrics (corresponding disclosure in at least [0170], where the recorded metric is the instrument positioning “Computer system 600 (FIG. 6) may step through the set of 3D stereoscopic images as the surgical procedure progresses, e.g., based on user input via a user interface or voice commands, or based on information that specifies the surgeon's actions or location in patient's anatomy.
For example, the information may include dynamic tracking of the surgeon and/or one or more of the surgeon's surgical tools (such as a scalpel), using: a local positioning system (such as a 3D local positioning system that tracks the position of one or more surgical tools in the patient, such as relative to one or more anatomical landmarks in the patient), image processing of images acquired in the operating room, etc.”).

Regarding Claim 10, Aguirre-Valencia further teaches wherein providing the updated procedural plan as the procedural plan includes a step of adjusting a holographic representation by the augmented reality system based on the recorded metric (corresponding disclosure in at least [0170], where the holographic representation by the augmented reality system will be adjusted (the augmented reality headset) “the set of 3D stereoscopic images may be dynamically updated during the surgical procedure if the surgeon deviates from the predefined surgical plan. In particular, the set of 3D stereoscopic images may be displayed on a 3D display (such as a monitor, an augmented-reality headset”).

Regarding Claim 11, Aguirre-Valencia further teaches wherein the augmented reality system includes a head-mounted display for rendering a holographic representation to the practitioner performing the medical procedure (corresponding disclosure in at least [0170], where the system includes a head-mounted display “the set of 3D stereoscopic images may be displayed on a 3D display (such as a monitor, an augmented-reality headset”).
Regarding Claim 12, Aguirre-Valencia teaches wherein the head-mounted display is configured to overlay the holographic representation within a view by the practitioner of the patient receiving the medical procedure (corresponding disclosure in at least [0179], where the system will continue to update the overlay based on the perspective (view) of the user “as the user interacts with the 3D stereoscopic images and/or the one or more 2D projections and changes their viewing perspective, the computer system may dynamically update the 3D stereoscopic images and the 2D projections based on the current perspective (operation 1812). In some embodiments, note that the 2D projections are always presented along a perspective direction perpendicular to the user so that motion parallax is registered in the 2D projections”).

Regarding Claim 13, Aguirre-Valencia further teaches wherein providing the updated procedural plan as the procedural plan, thereby establishing the feedback loop for the procedural plan of the medical procedure, includes a step of analyzing a postoperative result to compare an outcome of the updated procedural plan with the procedural plan within the step of providing the procedural plan (corresponding disclosure in at least [0169], where the procedural plan (the preoperative plan) is compared with an updated plan by allowing surgeons to interact and mark the procedure “visualize patient specific anatomy that has been identified as part of the preoperative plan at its patient specific location through the patient's body and plan out a series of tasks or operations they will perform on a patient during a planned surgical procedure (i.e., to determine a surgical plan). True 3D may allow the surgeon to view the 3D environment in the patient (including organs and landmarks in the context of the patient's anatomy) as the planned surgical procedure progresses.
This may allow the surgeon to visualize the planned surgical procedure and to interact with the 3D stereoscopic images (which may include haptic interaction with a virtual instrument as a surgical tool) so that the surgeon can determine an efficient manner in which to perform the planned surgical procedure. The resulting set of 3D stereoscopic images (including image parallax, motion parallax, prehension and/or stereopsis scaling) may be saved by the surgeon for subsequent viewing before and/or during the surgical procedure”) and, wherein the step of analyzing the postoperative result further includes steps of ascertaining a placement of a medical instrument and ascertaining an effectiveness of the medical instrument (corresponding disclosure in at least [0170], where the placement of the medical instrument is ascertained “the information may include dynamic tracking of the surgeon and/or one or more of the surgeon's surgical tools (such as a scalpel)” and further where the system will indicate the most effective placement (correcting the deviation/showing the predefined plan if veered off-course) “ If the surgeon intentionally or unintentionally deviates from the predefined surgical plan (e.g., using on-the-fly or dynamic analysis of the location of a surgical tool relative to the predefined surgical plan), computer system 600 (FIG. 6) may update or revise the displayed 3D stereoscopic images. The revised 3D stereoscopic images may reflect or indicate: that a deviation from the predefined surgical plan has occurred (e.g., via visual, auditory and/or sensory feedback), the current location (relative to the patient's anatomy), and/or how to correct the deviation so that the surgical procedure follows the predefined surgical plan”). 
Regarding Claim 14, Aguirre-Valencia further teaches wherein the augmented reality system is further configured to track a position and orientation of a medical instrument in real-time during the medical procedure (corresponding disclosure in at least [0170], where the position is tracked in real-time (dynamic tracking) “the information may include dynamic tracking of the surgeon and/or one or more of the surgeon's surgical tools (such as a scalpel), using: a local positioning system (such as a 3D local positioning system that tracks the position of one or more surgical tools in the patient, such as relative to one or more anatomical landmarks in the patient)”).

Regarding Claim 15, Aguirre-Valencia further teaches wherein the augmented reality system is further configured to integrate and display data from an imaging system and a tracking system (corresponding disclosure in at least [0047], where integrated data includes one from an imaging system (i.e., CT) “data engine 110 may receive input data (such as a computed-tomography or CT scan, histology, an ultrasound image, a magnetic resonance imaging or MRI scan, or another type of 2D image slice depicting volumetric information), including dimensions and spatial resolution” and further in [0050], where information for a tracking system (head-tracking information from optional position sensors) is also used “one or more optional position sensors 116 (which may be separate from or integrated into display 114) may dynamically track movement of the head of viewer 122 with up to six degrees of freedom, and this head-tracking information (e.g., the positions of the eyes of viewer 122 relative to display 114) may be used by graphics engine”).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C.
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 16-18 are rejected under 35 U.S.C. 103 as being unpatentable over Aguirre-Valencia (US20160381256A1) in view of Liarno (US20230377714A1).

Regarding Claim 16, Aguirre-Valencia teaches the limitations of Claim 1 and further teaches wherein the augmented reality system is further configured to provide a postoperative review interface to display the recorded metric relative to the procedural plan (corresponding disclosure in at least [0170], where the procedure with the progression is saved alongside the recorded metric relative to the procedural plan (location of the surgical instruments) “If the surgeon intentionally or unintentionally deviates from the predefined surgical plan (e.g., using on-the-fly or dynamic analysis of the location of a surgical tool relative to the predefined surgical plan), computer system 600 (FIG. 6) may update or revise the displayed 3D stereoscopic images.
The revised 3D stereoscopic images may reflect or indicate: that a deviation from the predefined surgical plan has occurred (e.g., via visual, auditory and/or sensory feedback), the current location (relative to the patient's anatomy), and/or how to correct the deviation so that the surgical procedure follows the predefined surgical plan” and further in [0171] “Before, during and/or at the end of the surgical procedure, the surgical plan, the locations of the surgical instruments and the progression of the surgical procedure, and the outcome may be saved in a computer-readable data structure for subsequent analysis and/or use”), but does not teach providing adjustment of the procedural plan.

Liarno, in a similar field of endeavor, teaches a similar concept (adaptations for improving surgical plans) of providing adjustment of the procedural plan (corresponding disclosure in at least [0168], where given a postoperative review adjustments can be made “This postoperative plan 9030 may be newly generated based on postoperative data 3000 and/or may be a modification to the postoperative plan 8030 generated using the intraoperative data 2000 (and/or a manually input) and/or the postoperative plan 7030 generated using the preoperative data 1000 (and/or manually input)… manually input an adjustment to the postoperative plan 9030 via an electronic device. The postoperative plan 9030 may be continuously adjusted and/or updated as more postoperative data 3000 is collected”).

It would have been obvious to a person having ordinary skill in the art before the effective filing date to have incorporated providing adjustments of the procedural plan as taught by Liarno. One of ordinary skill in the art would have been motivated to incorporate this because it provides an improvement to the future plan for assisting further surgical procedures.
Regarding Claim 17, Aguirre-Valencia teaches the limitations of Claim 1 and further teaches the augmented reality system and the holographic representation and intraoperative guidance of the procedural plan on a plurality of iterations of: recording the metric during the medical procedure using the augmented reality system (corresponding disclosure in at least [0170], where there is a recorded metric (the location of the tool) “Computer system 600 (FIG. 6) may step through the set of 3D stereoscopic images as the surgical procedure progresses, e.g., based on user input via a user interface or voice commands, or based on information that specifies the surgeon's actions or location in patient's anatomy. For example, the information may include dynamic tracking of the surgeon and/or one or more of the surgeon's surgical tools (such as a scalpel)”); updating the procedural plan based on the recorded metric (corresponding disclosure in at least [0170], where the plan is updated based on the recorded metric (analysis of surgical tool location) providing an updated plan “If the surgeon intentionally or unintentionally deviates from the predefined surgical plan (e.g., using on-the-fly or dynamic analysis of the location of a surgical tool relative to the predefined surgical plan), computer system 600 (FIG. 6) may update or revise the displayed 3D stereoscopic images. 
The revised 3D stereoscopic images may reflect or indicate: that a deviation from the predefined surgical plan has occurred (e.g., via visual, auditory and/or sensory feedback)”); and providing the updated procedural plan as the procedural plan, thereby establishing the feedback loop for the procedural plan of the medical procedure (corresponding disclosure in at least [0174] and Figure 16, where there is a feedback loop (the steps repeat) which updates the plan “Thus, when generating the stereoscopic images (operation 1614) or preparing the stereoscopic images, information from optionally tracked motion (operation 1610) and/or the optionally tracked interaction may be used to generate or revise the view and projection matrices”).

[Figure 16 of Aguirre-Valencia]

Aguirre-Valencia does not teach utilizing a machine learning algorithm. Liarno, in a similar field of endeavor, teaches utilizing a machine learning algorithm (corresponding disclosure in at least [0050], where a machine learning system is used for a constant feedback loop of inputting procedure information “System 20 may be an artificial intelligence (AI) and/or machine learning system. System 20 may include an AI module 21 (shown in FIG. 2), which may include or communicate with a memory system 40 configured to store the plurality of inputs or input information 10, outputs or output information 30, and stored data 50 from prior patients and/or prior procedures”).

It would have been obvious to a person having ordinary skill in the art before the effective filing date to have incorporated a machine learning algorithm as taught by Liarno. One of ordinary skill in the art would have been motivated to incorporate this because the machine learning improves the efficiency and accuracy of analyzing the multiple procedure information through a more automated process.
Regarding Claim 18, Aguirre-Valencia teaches the limitations of Claim 1 and further teaches the augmented reality system ([0170]), but does not teach to provide a preemptive action relative to the procedural plan using predictive analytics based on the step of providing the updated procedural plan as the procedural plan, thereby establishing the feedback loop for the procedural plan of the medical procedure.

Liarno, in a similar field of endeavor, teaches to provide a preemptive action relative to the procedural plan using predictive analytics based on the step of providing the updated procedural plan as the procedural plan, thereby establishing the feedback loop for the procedural plan of the medical procedure (corresponding disclosure in at least [0227], where predictions can be made for further actions (preemptive actions) based on the procedural outputs “The Finite Element Analysis algorithm 4040 may also be configured to assist postoperative algorithms 6000 in making determinations. The Finite Element Analysis algorithm 4040 may be configured to predict preoperative outputs 7000, intraoperative outputs 8000, and/or postoperative outputs 9000, and may further be configured to make determinations based on the predicted outputs 7000, 8000, 9000” with Figure 1 showing the process working in a feedback loop).

[Figure 1 of Liarno]

It would have been obvious to a person having ordinary skill in the art before the effective filing date to have incorporated a preemptive action using a predictive analysis as taught by Liarno. One of ordinary skill in the art would have been motivated to incorporate this because further actions and next steps are determined by taking into account a consistently updated procedural plan, providing the most accurate medical procedure.
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to KAITLYN KIM whose telephone number is (571) 272-1821. The examiner can normally be reached Monday-Friday 6-2 PST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Anne Kozak, can be reached at (571) 270-0552. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/K.E.K./ Examiner, Art Unit 3797
/SERKAN AKAR/ Primary Examiner, Art Unit 3797

Prosecution Timeline

Apr 25, 2024
Application Filed
Jan 02, 2026
Non-Final Rejection — §101, §102, §103 (current)


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 58%
With Interview: 99% (+65.7%)
Median Time to Grant: 3y 2m
PTA Risk: Low
Based on 12 resolved cases by this examiner. Grant probability derived from career allow rate.
