Prosecution Insights
Last updated: April 18, 2026
Application No. 18/251,429

SCANNER FOR INTRAOPERATIVE APPLICATION

Final Rejection §103
Filed: May 02, 2023
Examiner: KIM, KAITLYN EUNJI
Art Unit: 3797
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Kico Knee Innovation Company Pty Limited
OA Round: 2 (Final)
Grant Probability: 58% (Moderate)
OA Rounds: 3-4
To Grant: 3y 2m
With Interview: 99%

Examiner Intelligence

Grants 58% of resolved cases.

Career Allow Rate: 58% (7 granted / 12 resolved; -11.7% vs TC avg)
Interview Lift: +65.7% (strong), comparing resolved cases with vs. without an interview
Typical Timeline: 3y 2m average prosecution; 37 applications currently pending
Career History: 49 total applications across all art units

Statute-Specific Performance

§101: 11.9% (-28.1% vs TC avg)
§103: 42.2% (+2.2% vs TC avg)
§102: 21.4% (-18.6% vs TC avg)
§112: 22.5% (-17.5% vs TC avg)

Tech Center averages are estimates; figures are based on career data from 12 resolved cases.
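Reading each row as the examiner's rate plus a delta against the Tech Center average, the implied averages follow directly. A minimal Python sketch using only the figures above (the interpretation of the deltas is an assumption, not stated by the source data):

```python
# Per-statute rates and deltas read off the table above; the Tech Center
# average is recovered as rate - delta, purely for illustration.
examiner_rate = {"101": 0.119, "103": 0.422, "102": 0.214, "112": 0.225}
delta_vs_tc   = {"101": -0.281, "103": 0.022, "102": -0.186, "112": -0.175}

for statute, rate in examiner_rate.items():
    tc_avg = rate - delta_vs_tc[statute]   # e.g. 101: 0.119 - (-0.281) = 0.400
    print(f"§{statute}: {rate:.1%} vs TC avg {tc_avg:.1%} ({delta_vs_tc[statute]:+.1%})")
```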

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

Claims 1-18 and 21 are pending in this application; claims 14-18 are withdrawn, and claims 1-13 and 21 are examined.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 4-13, and 21 are rejected under 35 U.S.C. 103 as being unpatentable over Charron (US20190117318A1) in view of Kang (WO2014093480A1) and in further view of Amanatullah (US20190231433A1).

Regarding Claim 1, Charron teaches a tissue scanning system comprising: a depth sensor configured to determine distance to a surface (corresponding disclosure in at least [0026], where there is a mobile unit tracking marker (depth sensor) which determines the distance/relative location relative to a surgical site: "mobile unit tracking marker fixedly disposed or disposable in relation to the image capture device and externally trackable by a tracking sensor fixedly disposable at a distance therefrom so to sense and thereby track a relative location of the image capture device relative to the surgical site and associate an intraoperative surgical site location with the image based at least in part on the relative location"; further in [0071], where it's specified there is a 3D tracking system, which would capture depth: "the tracking system 213 comprises a three-dimensional (3D) optical tracking stereo camera, such as a Northern Digital Imaging® (NDI) optical tracking stereo camera, which can be configured to locate reflective sphere tracking markers 512"); a pointer device (corresponding disclosure in at least [0091] and Figure 6a, where there is a pointer device: "the tool 601 comprises a rigid pointer or pointing tool 600 rigidly coupled to a set of tracking markers 610 fixedly disposed");

[Figure 6a of Charron]

a camera-based tracking system configured to determine relative orientation and position between an anatomical feature and the pointer device (corresponding disclosure in at least [0065], where there is a camera with the device for sensing: "the mobile unit 505 may generally comprise one or more imaging sensors (e.g. see image capture sensor or camera 560" and [0008], where the system is for position and orientation tracking of features: "provide real-time visuals of a surgical site while also providing for an accurate tracking of the unit's position and orientation relative to the surgical site by leveraging tracking features") and a processing device (corresponding disclosure in at least [0056]: "components of a surgical navigation system, such as a control and processing unit, a tracking system").
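Illustrative note (not part of the cited disclosures): the "relative orientation and position between an anatomical feature and the pointer device" recited above amounts to composing the two rigid-body poses a tracking camera reports for the anatomy markers and the pointer markers. A minimal numpy sketch, with all names and frame conventions assumed:

```python
import numpy as np

def pose(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """4x4 homogeneous transform from a 3x3 rotation and a translation vector."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def relative_pose(T_cam_anatomy: np.ndarray, T_cam_pointer: np.ndarray) -> np.ndarray:
    """Pose of the pointer device expressed in the anatomical feature's frame.

    Both inputs are poses reported by the tracking camera; their composition
    gives the relative orientation and position between the two objects.
    """
    return np.linalg.inv(T_cam_anatomy) @ T_cam_pointer

# Example: pointer marker 10 cm further from the camera than the anatomy marker.
T_ca = pose(np.eye(3), np.array([0.0, 0.0, 0.5]))
T_cp = pose(np.eye(3), np.array([0.0, 0.0, 0.6]))
print(relative_pose(T_ca, T_cp)[:3, 3])   # -> translation of 0.1 along z
```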
Charron does not teach generating a surface point cloud of a surface associated with the anatomical feature based on a plurality of determined distances from the depth sensor and corresponding relative orientation and position of the pointer device relative to the anatomical feature. Kang, in a similar field of endeavor, teaches a similar concept (surgical navigation systems) of generating a surface point cloud of a surface associated with the anatomical feature based on a plurality of determined distances from the depth sensor and corresponding relative orientation and position of the pointer device relative to the anatomical feature (corresponding disclosure in at least [0029], where point clouds are generated associated with anatomical features (the bones) for 3D representation and registration, which would show the distance and orientation/position: "A point cloud is created using the numerous points of contact. The point cloud can then be aligned with the three-dimensional representation of bone 44, which was created preoperatively, to register the bone 44 to the three-dimensional representation"). It would have been obvious to a person having ordinary skill in the art before the effective filing date to have incorporated generating a surface point cloud of a surface with the anatomical features as taught by Kang. One of ordinary skill in the art would have been motivated to incorporate this because the surface point clouds find the different location points of the anatomy and serve to measure the characteristics, including the position and depth of the surrounding area.

Charron and Kang do not teach wherein the depth sensor is a distinct component from the camera-based tracking system. Amanatullah, in a similar field of endeavor, teaches a similar concept (surgical tracking) of wherein the depth sensor is a distinct component from the camera-based tracking system (corresponding disclosure in at least [0014], where there is a distinct depth sensor component: "the computer system can access optical scan data from an optical sensor (e.g., a LIDAR or other depth sensor…"). It would have been obvious to a person having ordinary skill in the art before the effective filing date to have incorporated a distinct depth sensor component as taught by Amanatullah. One of ordinary skill in the art would have been motivated to incorporate this because a separate sensor can capture further information regarding the three-dimensional environment using information from lights or lasers.

Regarding Claim 4, the combined references noted above teach the limitations of Claim 1, and Charron further teaches wherein the pointer device includes a guide tip, wherein the relative position of the guide tip to the depth sensor is fixed, or selectively fixed, during use (corresponding disclosure in at least [0026], where the tracking marker (depth sensor) is disposed or fixed onto the device which includes a guide tip, and is fixed there: "a mobile unit tracking marker fixedly disposed or disposable in relation to the image capture device and externally trackable by a tracking sensor fixedly disposable at a distance therefrom so to sense and thereby track a relative location of the image capture device relative to the surgical site").
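Illustrative note: the claimed point-cloud generation can be pictured as pushing each depth reading out along the sensor's measurement direction and mapping it through the pointer's tracked pose into the anatomy frame. A minimal sketch, where the ray direction and sensor offset are placeholder assumptions not fixed by the claim or the references:

```python
import numpy as np

def surface_point_cloud(samples,
                        ray_dir=np.array([0.0, 0.0, 1.0]),
                        sensor_offset=np.zeros(3)) -> np.ndarray:
    """Accumulate an N x 3 point cloud in the anatomical feature's frame.

    samples: iterable of (depth, T_anatomy_pointer) pairs, where depth is the
    distance reported by the depth sensor and T_anatomy_pointer is the pointer
    device's 4x4 pose in the anatomy frame for that reading. ray_dir and
    sensor_offset describe where the sensor sits on the pointer and which way
    it measures; both are assumed placeholders.
    """
    points = []
    for depth, T in samples:
        p_pointer = sensor_offset + depth * ray_dir   # point in pointer frame
        p_anatomy = T @ np.append(p_pointer, 1.0)     # map into anatomy frame
        points.append(p_anatomy[:3])
    return np.asarray(points)

# Two readings taken from the same (identity) pose, 52 mm and 48 mm away.
cloud = surface_point_cloud([(0.052, np.eye(4)), (0.048, np.eye(4))])
print(cloud.shape)   # -> (2, 3); two points at z = 0.052 and z = 0.048
```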
Regarding Claim 5, the combined references noted above teach the limitations of Claim 1, and Charron further teaches wherein a relative distance between the guide tip and the depth sensor is selected to be within a desired operating range of the depth sensor (corresponding disclosure in at least [0084], where the position of the tracking tool is movable and can be at different configurations in relation to the guide tip: "the position and orientation in 3D of a tracked instrument, such as a tool, is determinable… a tracked tool may invoke not only general tracking, but also tracking, for example, of the tool's tip or body, and any sensors, as will be detailed below, that may be operatively coupled thereto in a designated configuration (e.g. at or near a tool tip, angled relative to a tool tip or shaft, displaced and/or angled relative to other tool-mounted sensors, etc., within the context of tool, or again on one or another of the mobile unit's surfaces, a fixed or adjustable and thereby trackable orientation or alignment of a given sensor mounted thereon, etc.)").

Regarding Claim 6, the combined references noted above teach the limitations of Claim 5, and wherein the guide tip is configured to contact an index point on the surface associated with the anatomical feature (corresponding disclosure in at least [0040] of Kang, where there is a tip which contacts a particular point on the feature: "wherein the guide tip is configured to contact an index point on the surface associated with the anatomical feature"), wherein the guide tip aids in maintaining a scanning distance between the depth sensor and the surface to be within the desired operating range (corresponding disclosure in at least [0040], where the tip's location is measured and determined relative to the trackable element, which would assist in maintaining an operating range: "the transformation between the coordinate system of the probe 38 and the trackable element 40 can be calculated. In this manner, the surgical controller 10 is able to determine where the tip of registration probe 38 is located relative to the trackable element 40").

Regarding Claim 7, the combined references noted above teach the limitations of Claim 6, and wherein the contact between the index point and the guide tip forms a pivot point such that as the pointer device is moved relative to the anatomical feature around the pivot point (corresponding disclosure in at least Figure 7 of Charron and [0098] of Charron, where there is an instrument point (guide tip) which makes contact with the cavity (where the index point would form), and depending on the amount of pressure applied and how the instrument is held, the orientation (pivot point) would change or exist: "by mapping each instrument's position and orientation, the tracking system may also generally extrapolate a location and orientation of the instrument's various segments, such as an instrument's tip for example, when located and used within the surgical cavity"),

[Figure 7 of Charron]

and the depth sensor determines a corresponding depth to the surface for that relative orientation and position (corresponding disclosure in at least [0071] of Charron, where there is a tool for determining depth, or 3D space) to generate the surface point cloud of the surface (corresponding disclosure in at least [0029] of Kang, where surface point clouds can be generated: "A point cloud is created using the numerous points of contact. The point cloud can then be aligned with the three-dimensional representation of bone 44, which was created preoperatively, to register the bone 44 to the three-dimensional representation").

Regarding Claim 8, the combined references noted above teach the limitations of Claim 7, and Charron further teaches wherein the pivot point is an intermediate reference point used to determine relative orientation and position of the anatomical features and the pointer device (corresponding disclosure in at least [0098], where the position of the instrument (including the pivot point tip) is used in determining the position and orientation of the device: "by mapping each instrument's position and orientation, the tracking system may also generally extrapolate a location and orientation of the instrument's various segments, such as an instrument's tip for example, when located and used within the surgical cavity (i.e. down-port location and orientation in the context of a port based procedure)" and further in [0099], where the anatomical features can also be determined: "a tracked sensored tool tip may be enhanced via the disposition of one or more cameras (e.g. miniature camera with a micro lens) at the tool tip to provide real-time intraoperative inner-cavity").

Regarding Claim 9, the combined references noted above teach the limitations of Claim 1, and Amanatullah further teaches wherein the depth sensor is selected from one or more of: a Lidar (light detection and ranging); and an optical rangefinder (corresponding disclosure in at least [0030], where there are LIDAR sensors and other optical sensors, which generate a 3D image, which would show range: "the computer system can interface with a single optical sensor (e.g., an infrared, LIDAR, depth and/or any other optical sensor)… the computer system can access optical scan data from each optical sensor in the array of optical sensors and stitch together the optical scans to generate a three-dimensional (or "3D") panoramic image of the surgical field").

Regarding Claim 10, the combined references noted above teach the limitations of Claim 21, and Charron further teaches a second camera mounted to the pointer device (corresponding disclosure in at least [0072] and Figure 3, where there is a camera on the device: "secondary displays 205, 211 can provide an output of the tracking camera 213, which may include, but is not limited to, axial, sagittal and/or coronal views as part of a multi-view display, for example, and/or other views as may be appropriate, such as views oriented relative to the at least one tracked instrument" and [0079], where there are two cameras: "the navigation system 200 may further or alternatively comprise a plurality of wide-field cameras, e.g., two additional wide-field cameras (not shown) being implemented with video overlay information, wherein one camera is mountable in relation to the mobile unit 505 (e.g. overhead camera) and the other camera is mountable in relation to the navigation system"),

[Figure 3 of Charron]

wherein the depth sensor is directed in a direction within a field of view of the second camera (corresponding disclosure in at least Figure 3, where the depth sensor is within the field of view for the camera), and a graphical user interface to display at least part of an image from the second camera (corresponding disclosure in at least [0105], where there is a unit display: "a sensor/probe (optical) line-of-sight axis may be adjusted at an angle relative to a unit display (564A, 564B), particularly where a direct line-of-sight imaging sensor/probe and display configuration is not convenient or ergonomically ideal given the application at hand").

Regarding Claim 11, the combined references noted above teach the limitations of Claim 1, and wherein the at least one processing device is further configured to: receive a patient profile of the anatomical feature (corresponding disclosure in at least [0129] of Charron, where a patient profile (MRI image, US image, etc.) is received: "generate intraoperative input(s) by way of a variety of imaging devices, including anatomy specific MRI devices, surface array MRI scans, endo-nasal MRI devices, anatomy specific ultrasound (US) scans, endo-nasal US scans, anatomy specific computerized tomography (CT) or positron emission tomography"), determining a predicted outline of the anatomical feature based on the patient profile (corresponding disclosure in at least [0087] of Amanatullah, where the system identifies the features of the anatomical features, which would include an outline: "the computer system can identify a set (or "constellation") of real features on and/or near the hard tissue of interest predicted to persist throughout the surgery (i.e., predicted to not be removed from the patient during the surgery)"), and generate a modified image comprising the image from the second camera superimposed with the predicted outline (corresponding disclosure in at least [0087] of Amanatullah, where the system updates the image, or modifies the image based on the prediction: "Later, as the hard tissue of interest is modified (e.g., resected, cut connected to a surgical implant) during the surgery, the computer system can transition to registering the virtual patient model to the hard tissue of interest via this set of real, persistent features rather than directly to the real hard tissue of interest"), wherein the graphical user interface displays the modified image to guide a user to direct the depth sensor mounted to the pointer device to surface(s) corresponding to the predicted outline of the anatomical feature (corresponding disclosure in at least [0022] of Amanatullah, where there is a display of the image based on the updated contour: "the computer system can implement similar methods and techniques to generate virtual reality frames depicting both real patient tissue and virtual content (e.g., target resected contours of tissues of interest defined in a virtual patient model thus registered to the real patient tissue)").
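Illustrative note: the "modified image" recited in Claim 11 is the second camera's frame with the predicted outline drawn over it before display. A minimal sketch, with the image size, outline coordinates, and color as placeholders (projecting the outline into the image plane is out of scope here):

```python
import numpy as np

def superimpose_outline(frame: np.ndarray, outline_px, color=(0, 255, 0)) -> np.ndarray:
    """Return a copy of the second-camera frame with the predicted outline marked on it.

    frame:      H x W x 3 image from the second camera.
    outline_px: iterable of (row, col) pixel coordinates of the predicted outline,
                already projected into the camera's image plane.
    """
    out = frame.copy()
    h, w = out.shape[:2]
    for r, c in outline_px:
        if 0 <= r < h and 0 <= c < w:   # ignore points falling outside the frame
            out[r, c] = color
    return out

# Example: mark a short outline fragment on a blank 480 x 640 frame.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
modified = superimpose_outline(frame, [(100, c) for c in range(100, 120)])
print(modified[100, 100])   # -> [  0 255   0]
```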
Regarding Claim 12, the combined references noted above teach the limitations of Claim 11, and Kang further teaches wherein the processing device is further configured to: compare the generated surface point cloud with the patient profile and generate an updated patient profile based on a result of the comparison (corresponding disclosure in at least [0029], where there is an initial patient profile (the representation completed pre-operatively), which is compared and registered into a 3D representation (updated patient profile): "A point cloud is created using the numerous points of contact. The point cloud can then be aligned with the three-dimensional representation of bone 44, which was created preoperatively, to register the bone 44 to the three-dimensional representation").

Regarding Claim 13, the combined references noted above teach the limitations of Claim 1, and Charron further teaches wherein the depth sensor and the at least one processing device are part of a mobile communication device (corresponding disclosure in at least [0015], where there is a processing unit with a mobile unit: "at least one of the surgical image processing unit or the tracking unit is at least partially implemented by a digital processor of the mobile unit").

Regarding Claim 21, the combined references noted above teach the limitations of Claim 1, and Charron further teaches wherein the depth sensor is mounted to the pointer device (corresponding disclosure in at least [0071], where there are 3D tracking markers (depth sensor) which are on the pointer device (tool): "may be determined by the tracking camera 213 by automated detection of tracking markers 512, 212 placed on these tools, wherein the 3D position and orientation of these tools can be effectively inferred and tracked by tracking software from the respective position of the tracked markers 512, 212").

Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Charron (US20190117318A1), Kang (WO2014093480A1), and Amanatullah (US20190231433A1) as applied in Claim 1, and in further view of Zimmermann (US20220125517A1).
Regarding Claim 2, the combined references of Charron and Kang teach the limitations of Claim 1, and Charron further teaches one or more pointer markers attached to the pointer device (corresponding disclosure in at least [0071], where there are tracking markers (pointer markers) attached to the device: "tracking markers 512, 212 placed on these tools, wherein the 3D position and orientation of these tools can be effectively inferred and tracked by tracking software from the respective position of the tracked markers 512, 212"); wherein to determine relative orientation and position between the anatomical feature and the pointer device, the camera-based tracking system, or the at least one processing device (corresponding disclosure in at least [0071], where there is a tracking system comprising a camera: "the tracking system 213 comprises a three-dimensional (3D) optical tracking stereo camera, such as a Northern Digital Imaging® (NDI) optical tracking stereo camera, which can be configured to locate reflective sphere tracking markers 512, 212 in 3D space"), is further configured to: identify the pointer markers and tissue markers in one or more fields of view of the camera-based tracking system (corresponding disclosure in at least [0071], where the camera tracker is used for identification/location of the markers: "the tracking system 213 comprises a three-dimensional (3D) optical tracking stereo camera, such as a Northern Digital Imaging® (NDI) optical tracking stereo camera, which can be configured to locate reflective sphere tracking markers 512, 212 in 3D space"); and based on locations of the pointer markers and tissue markers in the field of view, calculate the relative orientation and position between the anatomical feature and the pointer device (corresponding disclosure in at least [0071], where the location data is contained for the markers: "location data of the mobile imaging unit 505, access port 206, introducer 210 and its associated pointing tool, and/or other tracked instruments/tools, may be determined by the tracking camera 213 by automated detection of tracking markers 512, 212 placed on these tools, wherein the 3D position and orientation of these tools can be effectively inferred and tracked by tracking software from the respective position of the tracked markers 512, 212").

Charron does not teach one or more tissue markers attached to the anatomical feature. Zimmermann, in a similar field of endeavor, teaches one or more tissue markers attached to the anatomical feature (corresponding disclosure in at least [0085], where there are markers attached to the anatomical feature (patient tibia): "The non-mechanical tracking system is an optical tracking system with a detection device 44 and trackable elements (e.g. navigation markers 46, 47) that are respectively disposed on tracked objects (e.g., patient tibia 10 and femur 11) and are detectable by the detection device 44"). It would have been obvious to a person having ordinary skill in the art before the effective filing date to have incorporated tissue markers attached to the anatomical feature as taught by Zimmermann. One of ordinary skill in the art would have been motivated to incorporate this because having markers on the anatomical features ensures detection and exact location of the area of interest and the contour/shape of it.

Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Charron (US20190117318A1), Kang (WO2014093480A1), Amanatullah (US20190231433A1), and Zimmermann (US20220125517A1) as applied in Claim 2, and in further view of Lampert (US20210290315A1).

Regarding Claim 3, the combined references noted above teach the limitations of Claim 2, and the pointer markers ([0071] of Charron), but do not teach wherein the pointer markers and the tissue markers are ArUco fiducial markers. Lampert, in a similar field of endeavor, teaches a similar concept (surgical navigation systems) of ArUco fiducial markers (corresponding disclosure in at least [0208], where the markers are ArUco markers: "Tool tracker 1200 internally includes … and a set of fiducial markers 1205 for tracking by the 3D camera. … for detection and tracking of ArUco markers"). It would have been obvious to a person having ordinary skill in the art before the effective filing date to have incorporated ArUco markers as taught by Lampert. One of ordinary skill in the art would have been motivated to incorporate this because ArUco markers are commonly used for quick detection of position and orientation using a camera setup.

Response to Arguments

Applicant's arguments filed 01/16/2026 regarding the Drawing objections have been fully considered, and the objections are withdrawn in light of the amendments. Applicant's arguments filed 01/16/2026 regarding the 35 U.S.C. 112(b) rejections have been fully considered, and the rejections are withdrawn in light of the amendments. Applicant's arguments with respect to claim 1 regarding the 35 U.S.C. 103 rejection have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. All remaining claims are rejected due to their dependency on independent claim 1.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to KAITLYN KIM whose telephone number is (571) 272-1821. The examiner can normally be reached Monday-Friday 6-2 PST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Anne Kozak, can be reached at (571) 270-0552.
The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/K.E.K./
Examiner, Art Unit 3797

/SERKAN AKAR/
Primary Examiner, Art Unit 3797

Prosecution Timeline

May 02, 2023: Application Filed
Sep 22, 2025: Non-Final Rejection (§103)
Jan 16, 2026: Response Filed
Mar 24, 2026: Final Rejection (§103, current)

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 58%
With Interview: 99% (+65.7%)
Median Time to Grant: 3y 2m
PTA Risk: Moderate

Based on 12 resolved cases by this examiner. Grant probability derived from career allow rate.
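How these figures reduce to arithmetic on the examiner's record, as a minimal Python sketch; only the 7 granted / 12 resolved split, the 99% with-interview figure, and the +65.7 point lift come from the data above, and the without-interview rate is inferred from those two rather than reported:

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a fraction of resolved cases."""
    return granted / resolved

def interview_lift(rate_with: float, rate_without: float) -> float:
    """Percentage-point lift associated with holding an examiner interview."""
    return rate_with - rate_without

# 7 granted out of 12 resolved cases (from the examiner data above).
career = allow_rate(7, 12)
print(f"Career allow rate: {career:.1%}")   # -> 58.3%, displayed as 58%

# The 99% with-interview figure and the +65.7 point lift together imply an
# allow rate of roughly 33.3% for resolved cases without an interview.
print(f"Interview lift: {interview_lift(0.99, 0.333):+.1%}")   # -> +65.7%
```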
