Prosecution Insights
Last updated: April 19, 2026
Application No. 18/704,830

MIXED REALITY GUIDANCE OF ULTRASOUND PROBE

Status: Non-Final OA (§103)
Filed: Apr 25, 2024
Examiner: ROY, BAISAKHI
Art Unit: 3797
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Howmedica Osteonics Corp.
OA Round: 1 (Non-Final)
Grant Probability: 77% (Favorable)
Expected OA Rounds: 1-2
Median Time to Grant: 4y 2m
Grant Probability With Interview: 96%

Examiner Intelligence

Career Allow Rate: 77% (507 granted / 659 resolved; +6.9% vs Tech Center average — above average)
Interview Lift: +19.2% (strong), measured across resolved cases with vs without an interview
Typical Timeline: 4y 2m average prosecution; 32 applications currently pending
Career History: 691 total applications across all art units
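The headline figures above are simple ratios over the examiner's resolved cases. As a minimal sketch (variable names are illustrative; the counts are taken from this report, and treating the interview lift as additive is an assumption that happens to match the displayed 96%), they can be reproduced as:

```python
# Reproducing the dashboard's headline figures from the reported counts.
granted = 507                # applications granted by this examiner (from report)
resolved = 659               # total resolved cases (from report)

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")      # ~76.9%, displayed as 77%

tc_avg = allow_rate - 0.069                        # report: +6.9% vs TC average
interview_lift = 0.192                             # report: +19.2% interview lift
with_interview = allow_rate + interview_lift       # assumed additive
print(f"With interview: {with_interview:.1%}")     # ~96.1%, displayed as 96%
```

The same subtraction recovers the implied Tech Center baseline (roughly 70%) that the "+6.9% vs TC avg" comparison is made against.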

Statute-Specific Performance

§101: 6.6% (-33.4% vs TC avg)
§103: 52.8% (+12.8% vs TC avg)
§102: 12.1% (-27.9% vs TC avg)
§112: 17.1% (-22.9% vs TC avg)

Tech Center averages are estimates. Based on career data from 659 resolved cases.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action: "A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made."

Claims 1-3, 5-17, and 19-29 are rejected under 35 U.S.C. 103 as being unpatentable over Mahfouz (2021/0015559) in view of Kang.

With respect to claims 1, 15, and 29, Mahfouz teaches a system, method, and non-transitory computer-readable storage media comprising stored instructions executed by processing circuitry for obtaining reference data depicting a bone of a patient, i.e., the pre-operative imaging and model of the patient's anatomy such as the knee joint [0225]. Mahfouz teaches determining a physical location of the ultrasound probe, i.e., tracking of the ultrasound probe or the surgical instruments [0224, fig. 44]. Mahfouz teaches generating registration data that registers virtual locations on the bone of the patient, as depicted in the reference data, with corresponding physical locations on the bone of the patient, i.e., the virtual representations of the patient's knee-joint bones and the needle are overlaid on the real-world images of the patient's leg in a registered and aligned manner [0225].
Mahfouz further teaches generating virtual guidance based on the reference data, the registration data, and the physical location of the probe, wherein the virtual guidance instructs a clinician how to position the probe relative to a target location at which the probe is able to generate second ultrasound data providing information regarding a soft tissue structure of the patient, i.e., the operator has a direct, virtual line of sight to know precisely how far the needle is injected into the soft tissue of the joint and where it is located, and can visualize the appropriate needle path and anatomical structures through the augmented display [0225, fig. 44]. Mahfouz teaches obtaining ligament and soft tissue information from the bone models with the ultrasound system, where the surgical plan includes resection depths and orientation information [0224]. Mahfouz teaches a head-mounted mixed reality visualization device that outputs the virtual guidance to the clinician [fig. 44, AR helmet that includes a heads-up display].

With respect to claims 1, 2, 9, 15, 16, and 23, Mahfouz does not explicitly teach generating the registration data based on the ultrasound data generated by the ultrasound probe. In a similar field of endeavor, Kang teaches a system and method of registration using an ultrasound probe, where the computing device generates registration points based on received ultrasound data and determines which of the ultrasonic transducers are providing acceptable signals [0018]. Kang therefore teaches registration based on ultrasound data received from the ultrasound probe; during the registration process, the computing device determines the location and orientation of the two-dimensional array of transducers of the probe [0022-0024]. Kang further teaches the bone structure registration process with the ultrasonic data, corresponding to the patient's anatomy, received from the transducer on a registration probe [0033].
With respect to claims 2 and 16, Kang teaches generating registration data by obtaining an ultrasound image based on the first ultrasound data generated by the ultrasound probe at a physical location, generating displacement data between the probe and the portion of the bone depicted in the image (where the transducers are in contact with or sufficiently close to the anatomy or the bone [0018, 0021]), and generating the registration data based on the physical location of the probe and the displacement or contact data [0029, 0030, 0033-0035].

With respect to claims 9 and 23, Kang teaches obtaining an estimated model of the soft tissue structure such as a statistical model [0022], generating second registration data that registers virtual locations on the model of the soft tissue structure with corresponding physical locations on the bone, and determining, based on the second registration data and the location of the probe, a direction to move the probe so that the probe is at the target location or in contact with the anatomy [0034-0036]. It would therefore have been obvious to one of ordinary skill in the art to use the teaching of Kang to modify Mahfouz to obtain registration data directly from the ultrasound probe and generate registration points based on ultrasound data from the transducer in contact with the anatomy [Kang, 0033].

With respect to claims 3 and 17, Mahfouz in view of Kang teaches generating bone curve data, i.e., statistical shape variations of the knee anatomy matched with the femoral curvature [0282].
Mahfouz teaches extracting normal curvatures by comparison with a database of shapes and kinematics, using the closest output from the database to optimize the match with the overlay of the patient-specific shape information onto the image, where the contact regions are tracked to extract joint curvature and soft tissue information [0254, 0255]. Therefore, under the broadest reasonable interpretation, Mahfouz teaches generating curvature data of the bone and searching the bone in the reference data for a curvature or shape that matches the curvature or shape of the bone as depicted in the image, for optimal patellofemoral tracking [0254].

With respect to claims 5 and 19, Mahfouz teaches the reference data or model being created from a CT scan of the bone [0131]. Kang also teaches reference data or a model of the bone based on a CT scan ([0022], claim 11).

With respect to claims 6 and 20, Mahfouz in view of Kang teaches the reference data or model being a three-dimensional model of the bone [Mahfouz, 0008, 0155, 0156].

With respect to claims 7 and 21, Mahfouz in view of Kang teaches generating the virtual guidance by generating a virtual directional element that indicates a direction the clinician is to move the probe, with the virtual ultrasound overlay on the scanning path (Mahfouz, fig. 44, 0226).

With respect to claims 8 and 22, Mahfouz in view of Kang teaches an MR visualization device outputting the virtual guidance so that the virtual guidance appears to the clinician to be superimposed on the patient [Mahfouz, 0128, 0129, 0131, 0155, 0156, 0204-0206, 0228, 0230].

With respect to claims 10 and 24, Mahfouz teaches generating the estimated model of the soft tissue structure as a statistical shape model of the tissue structures [Mahfouz, 0255, 0261]. Kang also teaches utilizing a statistical shape model and a method of bone morphing [0022].
With respect to claims 11 and 25, Mahfouz in view of Kang teaches the soft tissue structure being ligaments [0268, 0270-0274].

With respect to claims 12, 13, 26, and 27, Mahfouz in view of Kang teaches multiple virtual guides that instruct the operator how to move the probe so that the probe is at different target locations, i.e., precisely guiding the needle after insertion and providing additional information regarding various soft tissue structures of the patient [Mahfouz, 0224, 0225]. Mahfouz therefore teaches using the virtual guides to allow the operator to visualize the appropriate needle path and anatomical structures through the augmented display, where the anatomical structures may be created through reconstruction of ultrasound data and rendered on the glasses in the appropriate position after registering the helmet to the ultrasound tracking system [0225]. Mahfouz also teaches an MR visualization device outputting the virtual guidance so that the virtual guidance appears to the clinician to be superimposed on the patient [Mahfouz, 0128, 0129, 0131, 0155, 0156, 0204-0206, 0228, 0230].

With respect to claims 14 and 28, Mahfouz in view of Kang teaches the virtual guides providing an indication of how to adjust the probe relative to the patient at the target location by providing a virtual resection plane on the AR visor, where the user may manually adjust the cutting slot of the cutting guide [0148, 0210, 0225], and therefore, under the broadest reasonable interpretation, allowing adjustments to angles and orientation of the probe relative to the patient.

Claims 4 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Mahfouz in view of Kang and further in view of Razzaque (2017/0024903). The previous references do not explicitly teach determining the location of the probe based on data from the MRI visualization.
In a related field of endeavor, Razzaque teaches determining the location of a sensor associated with a medical device, such as an ultrasound probe imaging a target structure such as a bone, based on MRI image data [0012, 0014, 0016]. It would therefore have been obvious to one of ordinary skill in the art to use the teaching of Razzaque to modify the previous references to perform more effective or accurate surgery [Razzaque, 0002].

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to BAISAKHI ROY, whose telephone number is (571) 272-7139. The examiner can normally be reached Monday-Friday, 7-3 EST. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Christopher Koharski, can be reached at 571-272-7230. The fax number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/BAISAKHI ROY/
Primary Examiner, Art Unit 3797

Prosecution Timeline

Apr 25, 2024 — Application Filed
Jan 21, 2026 — Non-Final Rejection (§103)
Mar 06, 2026 — Interview Requested
Mar 12, 2026 — Examiner Interview Summary
Mar 12, 2026 — Applicant Interview (Telephonic)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601803 — SYSTEM FOR MACHINE LEARNING-BASED MODEL TRAINING AND PREDICTION FOR EVALUATION OF PAIN (granted Apr 14, 2026; 2y 5m to grant)
Patent 12594087 — PRECISE EXTRACORPOREAL SHOCKWAVES DELIVERY (granted Apr 07, 2026; 2y 5m to grant)
Patent 12588829 — APPARATUS, SYSTEMS, AND METHODS FOR LOCALIZING MARKERS OR TISSUE STRUCTURES WITHIN A BODY (granted Mar 31, 2026; 2y 5m to grant)
Patent 12582485 — MULTIPLE-INPUT INSTRUMENT POSITION DETERMINATION (granted Mar 24, 2026; 2y 5m to grant)
Patent 12569265 — DUAL MODE ACOUSTIC LITHOTRIPSY TRANSDUCER (granted Mar 10, 2026; 2y 5m to grant)
Based on this examiner's 5 most recent grants in similar technology; study what changed in each case to get past this examiner.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 77%
With Interview: 96% (+19.2%)
Median Time to Grant: 4y 2m
PTA Risk: Low

Based on 659 resolved cases by this examiner. Grant probability is derived from the career allow rate.
