Prosecution Insights
Last updated: April 19, 2026

Application No. 17/206,552
COMPUTER-ASSISTED TRACKING SYSTEM USING ULTRASOUND

Status: Final Rejection (§103)
Filed: Mar 19, 2021
Examiner: FARAG, AMAL ALY
Art Unit: 3798
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Orthosoft ULC
OA Round: 6 (Final)

Outlook: Favorable
Grant probability: 66% (99% with an examiner interview)
Expected OA rounds: 7-8
Expected time to grant: 3y 1m

Examiner Intelligence

Career allowance rate: 66% (131 granted / 197 resolved); above average, though -3.5% vs the Tech Center average
Interview lift: +38.3%; resolved cases with an interview are allowed markedly more often than those without
Typical timeline: 3y 1m average prosecution; 30 applications currently pending
Career history: 227 total applications across all art units
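The headline figures above can be cross-checked directly from the raw counts shown on the dashboard (131 granted, 197 resolved, 227 total). A minimal sketch in Python:

```python
# Cross-checking the examiner dashboard figures from the raw counts above.
granted = 131     # career grants
resolved = 197    # resolved applications (grants plus abandonments)
total = 227       # total applications across all art units

allow_rate = granted / resolved   # reported on the dashboard as 66%
pending = total - resolved        # applications still open

print(f"Career allowance rate: {allow_rate:.1%}")  # -> 66.5%
print(f"Currently pending: {pending}")             # -> 30
```

Both derived values agree with the dashboard: the 66% career allowance rate is the rounded 66.5%, and the 30 pending applications are the unresolved remainder of the 227-case career history.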

Statute-Specific Performance

Examiner rates by rejection statute, with the difference vs the estimated Tech Center average:

§101: 10.6% (-29.4% vs TC avg)
§102: 12.2% (-27.8% vs TC avg)
§103: 47.0% (+7.0% vs TC avg)
§112: 25.2% (-14.8% vs TC avg)

Tech Center averages are estimates; figures are based on career data from 197 resolved cases.
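A useful sanity check on the panel above: subtracting each statute's "vs TC avg" delta from the examiner's rate recovers the Tech Center baseline the dashboard is comparing against. A short sketch, using only the figures quoted above:

```python
# Recovering the implied Tech Center baseline from each statute's
# examiner rate and its "vs TC avg" delta (all values in percent).
stats = {
    "§101": (10.6, -29.4),
    "§102": (12.2, -27.8),
    "§103": (47.0, +7.0),
    "§112": (25.2, -14.8),
}
for statute, (rate, delta) in stats.items():
    baseline = round(rate - delta, 1)  # baseline = rate - delta
    print(f"{statute}: examiner {rate}% vs implied TC baseline {baseline}%")
```

All four statutes imply the same 40.0% baseline, which suggests the chart's Tech Center reference is a single career-wide estimate rather than a per-statute figure.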

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

This action is in response to the amendments and remarks filed on 12/04/2026. The amendments filed on 12/04/2026 have been entered. Accordingly, Claims 1-3, 5-6, 9-15, 18, 20-21, 23-24 and 26-27 are pending. Claims 23 and 26 are withdrawn as provided in the previous office action. Amended claims 24 and 27 are rejoined. Claims 22 and 25 are canceled. The previous rejections of claims 1-3, 5-6, 9-15, 18, 20-21 have been withdrawn in light of Applicant’s amendments and remarks in the claim set filed 12/04/2026.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-3, 5-6, 9-15, 18, 20, 24 and 27 are rejected under 35 U.S.C. 103 as being unpatentable over Kruse et al. (U.S. 20170100092, April 13, 2017) (hereinafter, “Kruse”) in view of Mihailescu et al. (U.S. 20130237811, September 12, 2013) (hereinafter, “Mihailescu”).
Regarding Claim 1, Kruse teaches: An ultrasound tracking system for tracking a position and orientation of an anatomical feature in computer-assisted surgery (“The acoustic OTS 100 includes an acoustic transducer array and supporting structure 110…including an arm or leg extremity, head, neck, breast, torso or other body part, and used by the acoustic OTS 100 for acquiring acoustic imaging data, e.g., for producing an acoustic image, range-Doppler measurements, and/or feed into a therapeutic systems such as an orthopedic surgical system for affecting surgical operations.” [0059]; “…the system includes a position tracking device 146 to provide data to the data processing unit 144 used to determine the location coordinates, orientation, and other position and motion information of orthopedic structures of the body part with 6 DoF.” [0067]) See Fig. 1C), the ultrasound tracking system comprising: an ultrasound imaging system having a plurality of phased-array ultrasound probe units distributed around said anatomical feature at one or more positions thereof, the phased- array ultrasound probe units being adapted for emitting ultrasound signals successively towards different portions of said anatomical feature (“FIG. 8A is an illustration depicting a simple arrangement of two transducers 800A and 800B transmitting and receiving echoes though soft tissue that contains a bone. 
For illustration purposes, the transducers 800A and 800B are arranged at 180 degrees with respect to each other” [0150], “…the system includes a position tracking device 146 to provide data to the data processing unit 144 used to determine the location coordinates, orientation, and other position and motion information of orthopedic structures of the body part with 6 DoF…The position tracking device 146 is operable to track the position of the acoustic transducer array structure 110 of the acoustic OTS 100, e.g., including the position data of individual array elements 111 disposed in the structure 110.” [0067]), measuring echo signals returning from said portions of said anatomical feature and generating respective imaged echo datasets (“…the position tracking device 146 can measure the position of the acoustic transducer array structure 110 by employing a non-contact sensor of the device 146 to obtain data of the structure 110 and/or body part to which the structure 110 is attached.” [0067]; “the 6 DoF location of each transducer is measured dynamically with respect to one or more external spatial reference points.” [0077]; “…provide measurement data that can be observed from a test dashboard that includes charts showing the returned signals from each of the transducer elements as well as 6 DoF data in a numeric or graphical format.” [0238]); a coordinate tracking system tracking coordinates of said ultrasound phased array probe unit during said measuring, and generating corresponding coordinate datasets (“…the system includes a position tracking device 146 to provide data to the data processing unit 144 used to determine the location coordinates, orientation, and other position and motion information of orthopedic structures of the body part with 6 DoF…The position tracking device 146 is operable to track the position of the acoustic transducer array structure 110 of the acoustic OTS 100…” [0067]); and a controller being communicatively coupled to said
ultrasound imaging system and said coordinate tracking system, said controller having a processor and a memory having stored thereon instructions that when executed by said processor perform the steps of (“…the data processing unit 144 include a processor 182 to process data and a memory 184 in communication with the processor 182 to store data. For example, the processor 182 can include a central processing unit (CPU) or a microcontroller unit (MCU).” [0066];“…includes a transmit/receive electronics module (TREM) 110E in electrical communication with an acoustic probe device 120E … TREM 110E includes a system control unit to control the waveform generator unit for the synthesis of individual coded waveforms.” [0060], “…includes a connector 130 (e.g., cable) in communication with the acoustic transducer array structure 110 and the signal interface module 120” [0061], “…the system includes a position tracking device 146 to provide data to the data processing unit 144 used to determine the location coordinates, orientation, and other position and motion information of orthopedic structures of the body part” [0067]): outputting a sectional mapping of the anatomical feature based on the imaged echo datasets (“ FIGS. 3A and 3B show various three dimensional views of the acoustic OTS 100 attached to the femur of a subject, which could be during a diagnostic and/or therapeutic procedure. FIG. 3C shows 3D view of a portion of the disclosed system employing two arrays of transducers per leg, including attaching two acoustic transducer array structures 110 on the leg: one for tracking the tibia and one for tracking the femur. FIG. 
3D shows a 3D view of a break-out diagram depicting the example acoustic OTS 100 and the acoustic coupler 112 with respect to the subject.” [0071]); registering said sectional mapping of the anatomical feature in a common coordinate system based on said coordinate datasets (“Positional data can be referenced to a coordinate system that is registered to 3D models of the bones to be tracked for robotic assisted orthopedic surgeries. In some implementations, for example, the 3D bone models can be prepared prior to the surgical operation by CT scan or other 3D imaging modality as part of the surgical planning process.” [0227]; “…Register the 3PSNS system to spatial fiducial points on the ultrasound arrays; (5) Register the 3PSNS system with the acoustic OTS bone tracking to appropriate reference locations on the cortical surfaces of the femur and tibia” [0258]); and tracking said position and orientation of said anatomical feature based on said registering (“Positional data can be referenced to a coordinate system that is registered to 3D models of the bones to be tracked for robotic assisted orthopedic surgeries. In some implementations, for example, the 3D bone models can be prepared prior to the surgical operation by CT scan or other 3D imaging modality as part of the surgical planning process.” [0227]; “Once the OTS software determines a sufficient match to the 3D solid model, it automatically transitions to Track mode and begins reporting 6 DoF positional data to the STS via the IEEE 1394 serial interface. 
The message stream includes a health status word with each frame indicating the validity of the measurement of the bone positions relative to the arrays.” [0241]); wherein the anatomical feature is tracked, such that the tracking of the position and the orientation of the anatomical feature is done solely with the ultrasound imaging system (“…the system includes a position tracking device 146 to provide data to the data processing unit 144 used to determine the location coordinates, orientation, and other position and motion information of orthopedic structures of the body part with 6 DoF…The position tracking device 146 is operable to track the position of the acoustic transducer array structure 110 of the acoustic OTS 100…” [0067]).

Kruse does not teach that the tracking is performed without an invasive tracking tool. Mihailescu, in the field of image tracking systems, teaches various examples of fiducial objects (e.g., Figs. 4-5, 9 and 16) in connection with imaging tracking systems that can be mounted on other sensors, instruments and/or on a patient (i.e., invasive) and/or laid on the patient (non-invasive) [0097-0098][0104][0106-0108][0111][0146][0227]. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the invasive tracking tool in Kruse with a non-invasive tracking tool as taught in Mihailescu to improve patient comfort with lower risk of compromising the subject.

Regarding Claim 2, Kruse in view of Mihailescu teach the claim limitations as noted above.
Kruse further teaches: wherein said registering includes generating an anatomical feature model of said anatomical feature based at least on said imaged echo datasets, and registering said anatomical feature model in said coordinate system based on said coordinate datasets (“…includes a transmit/receive electronics module (TREM) 110E in electrical communication with an acoustic probe device 120E … TREM 110E includes a system control unit to control the waveform generator unit for the synthesis of individual coded waveforms.” [0060], “includes a connector 130 (e.g., cable) in communication with the acoustic transducer array structure 110 and the signal interface module 120” [0061], “…the system includes a position tracking device 146 to provide data to the data processing unit 144 used to determine the location coordinates, orientation, and other position and motion information of orthopedic structures of the body part” [0067]; “Positional data can be referenced to a coordinate system that is registered to 3D models of the bones to be tracked for robotic assisted orthopedic surgeries. In some implementations, for example, the 3D bone models can be prepared prior to the surgical operation by CT scan or other 3D imaging modality as part of the surgical planning process.” [0227]; “Once the OTS software determines a sufficient match to the 3D solid model, it automatically transitions to Track mode and begins reporting 6 DoF positional data to the STS via the IEEE 1394 serial interface. The message stream includes a health status word with each frame indicating the validity of the measurement of the of the bone positions relative to the arrays.” [0241]). Regarding Claim 3, Kruse in view of Mihailescu teach the claim limitations as noted above. 
Kruse further teaches: wherein said generating said anatomical feature model includes accessing a reference model base, said generating said anatomical feature model being further based on said reference model base, said reference model base being selected from a group consisting of: a tibia model base, a femur model base (“FIG. 3C shows 3D view of a portion of the disclosed system employing two arrays of transducers per leg, including attaching two acoustic transducer array structures 110 on the leg: one for tracking the tibia and one for tracking the femur.” [0071]; “…the disclosed acoustic OTS can provide a third party surgical navigation system (3PSNS) with relative positional data intra-operatively for human femur and tibia bone targets.” [0226]; “The OTS provides positional and angular reference data to the example electro-optical SNS allowing the 3PSNS to determine the position of the 3PSNS optical arrays with respect to the patient's bone coordinate system” [0228]). Regarding Claim 5, Kruse in view of Mihailescu teach the claim limitations as noted above. Kruse further teaches: wherein said plurality of phased-array ultrasound probe units includes at least two phase-array ultrasound probe units axially or circumferentially spaced-apart from one another relative to the anatomical feature, each proximate a respective portion of said anatomical feature and generating said imaged echo datasets, said coordinate tracking system tracking coordinates of each one of said at least two spaced-apart phased-array ultrasound probe units during said measuring (“FIG. 8A is an illustration depicting a simple arrangement of two transducers 800A and 800B transmitting and receiving echoes though soft tissue that contains a bone. 
For illustration purposes, the transducers 800A and 800B are arranged at 180 degrees with respect to each other” [0150], “…the system includes a position tracking device 146 to provide data to the data processing unit 144 used to determine the location coordinates, orientation, and other position and motion information of orthopedic structures of the body part with 6 DoF…The position tracking device 146 is operable to track the position of the acoustic transducer array structure 110 of the acoustic OTS 100, e.g., including the position data of individual array elements 111 disposed in the structure 110.” [0067]), measuring echo signals returning from said portions of said anatomical feature and generating respective imaged echo datasets (“…the position tracking device 146 can measure the position of the acoustic transducer array structure 110 by employing a non-contact sensor of the device 146 to obtain data of the structure 110 and/or body part to which the structure 110 is attached.” [0067]; “the 6 DoF location of each transducer is measured dynamically with respect to one or more external spatial reference points.” [0077]; “…provide measurement data that can be observed from a test dashboard that includes charts showing the returned signals from each of the transducer elements as well as 6 DoF data in a numeric or graphical format.” [0238]).

Regarding Claim 6, Kruse in view of Mihailescu teach the claim limitations as noted above. Kruse further teaches: further comprising displaying said tracked anatomical feature in real-time during the computer-assisted surgery (“…a plurality of ultrasound transducers of the acoustic array structure transmit and receive ultrasound waves that are used to track the shape, location, and movement of bone in real-time.” [0005]).

Regarding Claim 9, Kruse in view of Mihailescu teach the claim limitations as noted above.
Kruse further teaches: wherein said coordinate tracking system is an optical coordinate tracking system having a reference tracker mounted to said ultrasound imaging system, and a camera imaging at least said reference tracker during said measuring, said optical coordinate tracking system optically tracking said coordinates of said ultrasound imaging system based on said camera imaging (“Examples of the sensor of the position tracking device 146 can include, but is not limited to, an optical sensor (e.g., video camera, CCD, LED, etc.), a magnetic sensor (e.g., magnetometer, Hall effect sensor, MEMS-based magnetic field sensor, etc.), rate sensor (e.g., gyro sensor, accelerometer, etc.), and/or electromagnetic, radio-frequency, and/or microwave sensors, or other detectors. The position tracking device 146 is configured to provide the data processing unit 144 with processed coordinate information or with the raw sensor data for the data processing unit 144 to process to produce the coordinate information of the structure 111 and/or body part. The data processing unit 144 is operable to process the coordinate information with the received acoustic echoes obtained from the acoustic OTS 100 to generate 6 DoF coordinate estimates of the bone location, error estimates, acoustic images, and other relevant parameters of the orthopedic feature of the subject, e.g., with an update rate of 1 kHz or higher.” [0067]; “One example of the position tracking device 146 can include the Stryker Surgical Navigation System (SNS), e.g., such as the Stryker NAV3i Platform. The Stryker NAV3i Platform includes digital camera technology, positional data processing devices, and multiple visual display monitors for tracking in real time. For example, the Stryker NAV3i Platform includes a navigation camera arm with one or more cameras (e.g., Built-in LiveCam) for imaging over a large range of motion, e.g., to accommodate various procedures and approaches.” [0068]). 
Regarding Claim 10, Kruse in view of Mihailescu teach the claim limitations as noted above. Kruse further teaches: wherein said coordinate tracking system is a mechanical coordinate tracking system having a frame to which said ultrasound imaging system is movably mounted, and at least a sensor sensing a relative movement between said ultrasound imaging system and said frame, said mechanical coordinate tracking system tracking said coordinates of said ultrasound imaging system based on said sensed relative movement (“Examples of the sensor of the position tracking device 146 can include, but is not limited to, an optical sensor (e.g., video camera, CCD, LED, etc.), a magnetic sensor (e.g., magnetometer, Hall effect sensor, MEMS-based magnetic field sensor, etc.), rate sensor (e.g., gyro sensor, accelerometer, etc.), and/or electromagnetic, radio-frequency, and/or microwave sensors, or other detectors. The position tracking device 146 is configured to provide the data processing unit 144 with processed coordinate information or with the raw sensor data for the data processing unit 144 to process to produce the coordinate information of the structure 111 and/or body part. The data processing unit 144 is operable to process the coordinate information with the received acoustic echoes obtained from the acoustic OTS 100 to generate 6 DoF coordinate estimates of the bone location, error estimates, acoustic images, and other relevant parameters of the orthopedic feature of the subject, e.g., with an update rate of 1 kHz or higher.” [0067]; “One example of the position tracking device 146 can include the Stryker Surgical Navigation System (SNS), e.g., such as the Stryker NAV3i Platform. The Stryker NAV3i Platform includes digital camera technology, positional data processing devices, and multiple visual display monitors for tracking in real time. 
For example, the Stryker NAV3i Platform includes a navigation camera arm with one or more cameras (e.g., Built-in LiveCam) for imaging over a large range of motion, e.g., to accommodate various procedures and approaches.” [0068]). Regarding Claim 11, Kruse in view of Mihailescu teach the claim limitations as noted above. Kruse further teaches: further comprising a wearable element having an ultrasound imaging interface made of a solid acoustically transmissive material through which said ultrasound signals are propagated (“The acoustic OTS 100 includes an acoustic transducer array and supporting structure 110, also referred to as the “structure”, configured to conform to a user's body, including an arm or leg extremity, head, neck, breast, torso or other body part, and used by the acoustic OTS 100 for acquiring acoustic imaging data, e.g., for producing an acoustic image, range-Doppler measurements, and/or feed into a therapeutic systems such as an orthopedic surgical system for affecting surgical operations.” [0059]; “…the data processing unit 144 may be resident on one or more computers (e.g., desktop computer, laptop computer, a network of computer devices in data communication with each other via the Internet (e.g., in the ‘cloud’), or other computing device including, but not limited to, a smartphone, tablet, or wearable computing/communications device).” [0063]. See Figs. 3A-3C). Regarding Claim 12, Kruse in view of Mihailescu teach the claim limitations as noted above. Kruse further teaches: wherein said ultrasound imaging interface includes a surgery opening in the wearable element allowing access to said anatomical feature during said computer-assisted surgery (“The structure 110 can be configured to be curved or approximately curved, for example, as in the shape of a circle or ellipsoid. The curved structure may be open, for example, to cover 120 or 270 degrees around the body part. 
The opening may provide utility for accessing specific regions of the body,” [0073]). Regarding Claim 13, Kruse teaches: A method for tracking a position and orientation of an anatomical feature in computer-assisted surgery (“…a method for orthopedic tracking includes transmitting ultrasound pulses from a plurality of ultrasound transducers…” [0006]; “…the present technology can be used to track the tibia and femur bones in the leg during computer assisted surgery (CAS) of the knee, including, but not limited to, total knee arthroplasty (TKA) and total knee replacement (TKR).” [0007]), the method comprising: using an ultrasound imaging system has a plurality of phased-array ultrasound probe units distributed around said anatomical feature at one or more positions thereof, emitting phased-array ultrasound signals towards different portions of said anatomical feature, measuring echo signals returning from said portions of said anatomical feature and generating respective imaged echo datasets while tracking coordinates of said ultrasound imaging system, and generating corresponding coordinate datasets (“FIG. 8A is an illustration depicting a simple arrangement of two transducers 800A and 800B transmitting and receiving echoes though soft tissue that contains a bone. 
For illustration purposes, the transducers 800A and 800B are arranged at 180 degrees with respect to each other” [0150], “…the system includes a position tracking device 146 to provide data to the data processing unit 144 used to determine the location coordinates, orientation, and other position and motion information of orthopedic structures of the body part with 6 DoF…The position tracking device 146 is operable to track the position of the acoustic transducer array structure 110 of the acoustic OTS 100, e.g., including the position data of individual array elements 111 disposed in the structure 110.” [0067]), measuring echo signals returning from said portions of said anatomical feature and generating respective imaged echo datasets (“…the position tracking device 146 can measure the position of the acoustic transducer array structure 110 by employing a non-contact sensor of the device 146 to obtain data of the structure 110 and/or body part to which the structure 110 is attached.” [0067]; “the 6 DoF location of each transducer is measured dynamically with respect to one or more external spatial reference points.” [0077]; “…provide measurement data that can be observed from a test dashboard that includes charts showing the returned signals from each of the transducer elements as well as 6 DoF data in a numeric or graphical format.” [0238]); and a controller performing the steps of (“…the data processing unit 144 include a processor 182 to process data and a memory 184 in communication with the processor 182 to store data.
For example, the processor 182 can include a central processing unit (CPU) or a microcontroller unit (MCU).” [0066];“…includes a transmit/receive electronics module (TREM) 110E in electrical communication with an acoustic probe device 120E … TREM 110E includes a system control unit to control the waveform generator unit for the synthesis of individual coded waveforms.” [0060], “…includes a connector 130 (e.g., cable) in communication with the acoustic transducer array structure 110 and the signal interface module 120” [0061], “…the system includes a position tracking device 146 to provide data to the data processing unit 144 used to determine the location coordinates, orientation, and other position and motion information of orthopedic structures of the body part” [0067]): outputting a sectional mapping of the anatomical feature based on the imaged echo datasets (“ FIGS. 3A and 3B show various three dimensional views of the acoustic OTS 100 attached to the femur of a subject, which could be during a diagnostic and/or therapeutic procedure. FIG. 3C shows 3D view of a portion of the disclosed system employing two arrays of transducers per leg, including attaching two acoustic transducer array structures 110 on the leg: one for tracking the tibia and one for tracking the femur. FIG. 3D shows a 3D view of a break-out diagram depicting the example acoustic OTS 100 and the acoustic coupler 112 with respect to the subject.” [0071]); registering said sectional mapping of the anatomical feature in a common coordinate system based on said coordinate datasets; and tracking said position and orientation of said anatomical feature based on said registering (“Positional data can be referenced to a coordinate system that is registered to 3D models of the bones to be tracked for robotic assisted orthopedic surgeries. 
In some implementations, for example, the 3D bone models can be prepared prior to the surgical operation by CT scan or other 3D imaging modality as part of the surgical planning process.” [0227]; “…Register the 3PSNS system to spatial fiducial points on the ultrasound arrays; (5) Register the 3PSNS system with the acoustic OTS bone tracking to appropriate reference locations on the cortical surfaces of the femur and tibia” [0258]); wherein the anatomical feature is tracked without any said invasive tracking tool, such that the tracking of the position and the orientation of the anatomical feature is done solely with the ultrasound imaging system (“…the system includes a position tracking device 146 to provide data to the data processing unit 144 used to determine the location coordinates, orientation, and other position and motion information of orthopedic structures of the body part with 6 DoF…The position tracking device 146 is operable to track the position of the acoustic transducer array structure 110 of the acoustic OTS 100…” [0067]).

Kruse does not teach that the tracking is performed without an invasive tracking tool. Mihailescu, in the field of image tracking systems, teaches various examples of fiducial objects (e.g., Figs. 4-5, 9 and 16) in connection with imaging tracking systems that can be mounted on other sensors, instruments and/or on a patient (i.e., invasive) and/or laid on the patient (non-invasive) [0097-0098][0104][0106-0108][0111][0146][0227]. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the invasive tracking tool in Kruse with a non-invasive tracking tool as taught in Mihailescu to improve patient comfort with lower risk of compromising the subject.

Regarding Claim 14, Kruse in view of Mihailescu teach the claim limitations as noted above.
Claim 14 further recites limitations: wherein said registering includes generating an anatomical feature model of said anatomical feature based at least on said imaged echo datasets, and registering said anatomical feature model in said coordinate system based on said coordinate datasets. These limitations are present in claim 2 and are therefore rejected under the same rationale.

Regarding Claim 15, Kruse in view of Mihailescu teach the claim limitations as noted above. Claim 15 further recites limitations: wherein said generating said anatomical feature model includes accessing a reference model base, said generating said anatomical feature model being further based on said reference model base, said reference model base being selected from a group consisting of: a tibia model base, a femur model base. These limitations are present in claim 3 and are therefore rejected under the same rationale.

Regarding Claim 18, Kruse in view of Mihailescu teach the claim limitations as noted above. Kruse further teaches: wherein said tracking includes optically tracking a reference tracker mounted to said ultrasound imaging system during said measuring (“Examples of the sensor of the position tracking device 146 can include, but is not limited to, an optical sensor (e.g., video camera, CCD, LED, etc.), a magnetic sensor (e.g., magnetometer, Hall effect sensor, MEMS-based magnetic field sensor, etc.), rate sensor (e.g., gyro sensor, accelerometer, etc.), and/or electromagnetic, radio-frequency, and/or microwave sensors, or other detectors. The position tracking device 146 is configured to provide the data processing unit 144 with processed coordinate information or with the raw sensor data for the data processing unit 144 to process to produce the coordinate information of the structure 111 and/or body part.
The data processing unit 144 is operable to process the coordinate information with the received acoustic echoes obtained from the acoustic OTS 100 to generate 6 DoF coordinate estimates of the bone location, error estimates, acoustic images, and other relevant parameters of the orthopedic feature of the subject, e.g., with an update rate of 1 kHz or higher.” [0067]; “One example of the position tracking device 146 can include the Stryker Surgical Navigation System (SNS), e.g., such as the Stryker NAV3i Platform. The Stryker NAV3i Platform includes digital camera technology, positional data processing devices, and multiple visual display monitors for tracking in real time. For example, the Stryker NAV3i Platform includes a navigation camera arm with one or more cameras (e.g., Built-in LiveCam) for imaging over a large range of motion, e.g., to accommodate various procedures and approaches.” [0068]). Regarding Claim 20, Kruse in view of Mihailescu teach the claim limitations as noted above. 
Kruse further teaches: further comprising sandwiching a solid acoustically transmissive material between said ultrasound imaging system and an outer-skin surface proximate to said anatomical feature, said solid acoustically transmissive material being provided in the form of a wearable element (“The acoustic OTS 100 includes an acoustic transducer array and supporting structure 110, also referred to as the “structure”, configured to conform to a user's body, including an arm or leg extremity, head, neck, breast, torso or other body part, and used by the acoustic OTS 100 for acquiring acoustic imaging data, e.g., for producing an acoustic image, range-Doppler measurements, and/or feed into a therapeutic systems such as an orthopedic surgical system for affecting surgical operations.” [0059]; “…the data processing unit 144 may be resident on one or more computers (e.g., desktop computer, laptop computer, a network of computer devices in data communication with each other via the Internet (e.g., in the ‘cloud’), or other computing device including, but not limited to, a smartphone, tablet, or wearable computing/communications device).” [0063]. See Figs. 3A-3C). Regarding Claim 24, Kruse in view of Mihailescu teach the claim limitations as noted above. Kruse further teaches: wherein the phased-array ultrasound probe units are mounted to a frame in accordance with a known geometric relationship relative to said anatomical feature (“The acoustic transducer array structure 110 includes an array of transducer elements and a housing body 119 to contain and position the transducer elements 111 for transmitting and receiving acoustic signals to/from a mass to which the acoustic transducer array structure 110 is applied. 
The housing body 119 includes a curved section where the transducer elements 111 of the acoustic transmit and/or receive transducer array are positioned, where the curved section of the housing body 119 can be configured to various sizes and/or curvatures tailored to a particular body region or part where the structure 110 is to be applied in acoustic imaging, measurement, or other implementations. For example, the length, depth, and arc of the curved housing body 119 can be configured to make complete contact with a region of interest on an anatomical structure…” [0069]; “The transducer elements 111 are attached to the housing body 119 via a flexible bracket 118. The acoustic coupler 112 is able to conform directly onto the face of the transducer element 111, as illustrated in the diagram. In this example, the acoustic coupler 112 is attached to clip components of the flexible bracket 118 by an adhesive 113 on the external surface of the clips, e.g., to align in contact with the ‘tacky regions’ of the hydrogel and/or outer lining of the acoustic coupler 112. The clips are configured to attach around the lip of the housing body 119 to provide direct contact between the acoustic coupler 112 and the face 111B of the transducer element 111.” [0070]).

Regarding Claim 27, Kruse in view of Mihailescu teaches the claim limitations as noted above. Claim 27 further recites: wherein the phased-array ultrasound probe units are mounted to a frame in accordance with a known geometric relationship relative to said anatomical feature. These limitations are present in claim 24 and are therefore rejected under the same rationale.

Claim 21 is rejected under 35 U.S.C. 103 as being unpatentable over Kruse in view of Mihailescu et al. (U.S. 20130237811, September 12, 2013) (hereinafter, “Mihailescu”) as applied to claim 13 above, and further in view of Schwartz (US 20200107770) (hereinafter, “Schwartz”).
Regarding Claim 21, Kruse in view of Mihailescu teaches the claim limitations as noted above. The combination does not teach time stamping.

Schwartz, in the field of computer-assisted surgery, teaches: wherein said imaged echo datasets have corresponding time stamps identifying at what moment in time the echo signals they represent have been measured, said registering being further based on said time stamps (“wherein the processor is configured to create and generate a temporally-changing coordinate map over time based on, for each of the messages, the transmission time stamp that the message was transmitted, a receipt time stamp that the message was received and the known initial positional information”).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combination of references to include time stamping as taught by Schwartz, because the temporal and the anatomic map can be used separately or together to detect activity that diminishes over time. The thyroid activity that diminishes over time can be subtracted and the parathyroid activity that increases over time can be enhanced. Similar activity or variable activity can be modeled appropriately to reflect patterns manifested by the parathyroid and thyroid. [0015].

Response to Arguments

Applicant’s arguments with respect to amended claims 1 and 13 have been considered but are moot because of the new grounds of rejection applied.

Conclusion

Applicant’s amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to AMAL FARAG whose telephone number is (571) 270-3432. The examiner can normally be reached 8:30-5:30 M-F.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Keith Raymond, can be reached at (571) 270-1790. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/AMAL ALY FARAG/
Primary Examiner, Art Unit 3798
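The time-stamp rationale relied on for the Claim 21 rejection — registering each imaged echo dataset against a temporally-changing coordinate map using the time stamp of its acquisition — can be illustrated with a minimal sketch. The linear-interpolation scheme, the data layout, and the function name below are illustrative assumptions, not taken from Kruse, Mihailescu, or Schwartz:

```python
from bisect import bisect_left

def interpolate_pose(track, t):
    """Linearly interpolate a tracked 3-D position at time t from
    time-stamped samples. `track` is a sorted list of
    (timestamp_ms, (x, y, z)) pairs; t is the echo dataset's
    acquisition time stamp. Clamps outside the sampled interval."""
    times = [s[0] for s in track]
    i = bisect_left(times, t)
    if i == 0:
        return track[0][1]
    if i == len(times):
        return track[-1][1]
    (t0, p0), (t1, p1) = track[i - 1], track[i]
    w = (t - t0) / (t1 - t0)  # fractional position between samples
    return tuple(a + w * (b - a) for a, b in zip(p0, p1))

# Tracker positions sampled 1 ms apart (a 1 kHz update rate, per Kruse)
track = [(0.0, (0.0, 0.0, 0.0)),
         (1.0, (1.0, 0.0, 0.0)),
         (2.0, (2.0, 2.0, 0.0))]

# An echo dataset time-stamped midway between two tracker samples
pose = interpolate_pose(track, 1.5)
print(pose)  # (1.5, 1.0, 0.0)
```

The point of the sketch is only that a per-dataset time stamp lets the registration step look up (or interpolate) the coordinate map as it existed at the moment the echoes were measured, rather than at the moment they were processed.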

Prosecution Timeline

Mar 19, 2021
Application Filed
Apr 12, 2021
Response after Non-Final Action
Jan 27, 2023
Non-Final Rejection — §103
May 08, 2023
Response Filed
Sep 18, 2023
Final Rejection — §103
Nov 22, 2023
Response after Non-Final Action
Feb 26, 2024
Request for Continued Examination
Mar 06, 2024
Response after Non-Final Action
Sep 01, 2024
Non-Final Rejection — §103
Dec 05, 2024
Response Filed
Mar 20, 2025
Final Rejection — §103
May 21, 2025
Response after Non-Final Action
Jul 22, 2025
Request for Continued Examination
Jul 25, 2025
Response after Non-Final Action
Sep 04, 2025
Non-Final Rejection — §103
Dec 04, 2025
Response Filed
Mar 17, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12575744
DATA PROCESSING DEVICE AND METHOD
2y 5m to grant Granted Mar 17, 2026
Patent 12569220
BLOOD FLOW MEASUREMENT SYSTEM
2y 5m to grant Granted Mar 10, 2026
Patent 12564373
Spatially Aware Medical Device Configured for Performance of Insertion Pathway Approximation
2y 5m to grant Granted Mar 03, 2026
Patent 12564386
PROCESSING APPARATUS AND CONTROL METHOD
2y 5m to grant Granted Mar 03, 2026
Patent 12564387
ULTRASOUND DIAGNOSTIC APPARATUS AND ULTRASOUND DIAGNOSTIC SYSTEM
2y 5m to grant Granted Mar 03, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

7-8
Expected OA Rounds
66%
Grant Probability
99%
With Interview (+38.3%)
3y 1m
Median Time to Grant
High
PTA Risk
Based on 197 resolved cases by this examiner. Grant probability derived from career allow rate.
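The headline figures above reduce to simple ratios over the examiner's resolved cases. A minimal sketch, assuming the interview lift is the percentage-point gap in allow rate between resolved cases with and without an interview; the with/without split in the example call is hypothetical, and only the 131-granted/197-resolved totals come from this page:

```python
def allow_rate(granted, resolved):
    """Allowance rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def interview_lift(granted_iv, resolved_iv, granted_no, resolved_no):
    """Percentage-point gap in allow rate between resolved cases
    that had an examiner interview and those that did not."""
    return allow_rate(granted_iv, resolved_iv) - allow_rate(granted_no, resolved_no)

# Career allow rate: 131 granted of 197 resolved (from the page)
print(round(allow_rate(131, 197), 1))  # 66.5

# Hypothetical with/without-interview split summing to 131/197
print(round(interview_lift(55, 60, 76, 137), 1))
```

Note that a lift computed this way is correlational — applicants who interview may differ systematically from those who do not — so it supports, but does not prove, the "+38.3% with interview" projection.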
