DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-6 and 9-11 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent No. 6,106,464 to Bass et al., in view of U.S. PG Pub. No. 2002/0103432 A1 to Kawchuk, and further in view of U.S. PG Pub. No. 2002/0120192 A1 to Nolte et al.
Regarding claim 1, Bass discloses an apparatus, comprising: (a) a body having a distal end; (b) a location device, the location device being fixedly positioned relative to the distal end, the location device being configured to indicate a real-time position of the location device within three-dimensional space; and (c) an ultrasonic assembly, the ultrasonic assembly being fixedly positioned relative to the location device, the ultrasonic assembly comprising a transducer operable to generate ultrasonic waves, the distal end being configured to emit the ultrasonic waves generated by the transducer (see Figs. 1-3, abstract, col 2 ln 41-col 3 ln 61, and col 4 ln 27-col 7 ln 20).
Bass discloses an optical tracking system and not an electric or magnetic sensor positioning system.
However, Kawchuk discloses a surgical assistance device, comprising (b) a position sensor, the position sensor being fixedly positioned relative to the distal end, the position sensor being configured to generate a signal indicating a real-time position of the position sensor within three-dimensional space (see Fig. 2 and para 61). Examiner notes that reversing the sensor's location or the emitter's location on the device is well known in the art.
It would have been obvious and predictable to have substituted the electric or magnetic systems of Kawchuk for the optical sensing system of Bass because doing so would achieve the same predictable result, namely tracking the elements in 3D space. Examiner notes that flipping a sensor for an emitter would have been an obvious matter of design choice because doing so would predictably provide 3D tracking.
Bass appears to disclose a device wherein the ultrasonic assembly is configured to generate ultrasonic images (see Figs. 1-3, abstract, col 2 ln 41-col 3 ln 61, and col 4 ln 27-col 7 ln 20).
However, in the alternative, generating ultrasound images is well known and obvious for showing tissue structures beneath the skin for diagnostic and tracking purposes.
It would have been obvious and predictable to have also included ultrasound imaging because doing so would allow a user to visualize the bone and other tissues tracked in Bass.
Bass in combination with Nolte disclose a similar surgical system further comprising a processor configured to register a patient with one or more preoperative images based at least in part on one or more ultrasonic images generated by the ultrasonic assembly and the signal indicating a real-time position of the position sensor within three-dimensional space (see Bass Figs. 1-3, abstract, col 2 ln 41- col 3 ln 61, and col 4 ln 27-col 7 ln 20 and Nolte Fig. 8 and para 16-19, 30-31, and 34-41).
It would have been obvious and predictable to have combined the teachings of Bass and Nolte because doing so would ensure that the actual bone surface is registered in 3D space, and not just the position of the position sensor. Such an arrangement would increase the accuracy of registration of the bone in the ultrasound image, the pre-operative images, and the mechanically scanned data. Examiner notes that registering multiple modalities of data allows a surgeon to have a more complex and information-rich surgical display.
Regarding claim 2, Bass discloses a device, wherein the ultrasonic assembly is configured to determine a depth of soft tissue between bone in a patient and an outer surface of the patient in contact with the distal end (see Figs. 1-3, abstract, col 2 ln 41-col 3 ln 61, and col 4 ln 27-col 7 ln 20).
Regarding claim 3, Nolte discloses a similar surgical tracking system, further comprising a processor configured to determine a distance between the position sensor and the bone in the patient based at least in part on a combination of a known distance between the position sensor and the distal end and the depth of soft tissue determined by the ultrasonic assembly (see Fig. 1 and para 16-19, 30-31, and 34-41).
It would have been obvious and predictable to have combined the teachings of Bass and Nolte because doing so would ensure that the actual bone surface is registered in 3D space, and not just the position of the position sensor. Such an arrangement would increase the accuracy of registration of the bone in the ultrasound image, the pre-operative images, and the mechanically scanned data.
Regarding claim 4, Bass discloses a device further comprising an image guided surgery system external to the body, the processor being contained in the image guided surgery system (see Figs. 1-3, abstract, col 2 ln 41-col 3 ln 61, and col 4 ln 27-col 7 ln 20).
Regarding claim 5, Kawchuk discloses a similar surgical system, the image guided surgery system further comprising a field generator assembly, the field generator assembly being operable to generate an electromagnetic field, the position sensor being configured to generate a signal indicating a real-time position of the position sensor within three-dimensional space in response to the electromagnetic field (see Fig. 2 and para 61). Examiner notes that reversing the sensor's location or the emitter's location on the device is well known in the art.
It would have been obvious and predictable to have substituted the electric or magnetic systems of Kawchuk for the optical sensing system of Bass because doing so would achieve the same predictable result, namely tracking the elements in 3D space. Examiner notes that flipping a sensor for an emitter would have been an obvious matter of design choice because doing so would predictably provide 3D tracking.
Regarding claim 6, Bass in combination with Nolte disclose a similar surgical system, the processor being further configured to register a patient with one or more preoperative images based at least in part on the determined distance between the position sensor and the bone in the patient and the signal indicating a real-time position of the position sensor within three-dimensional space (see Bass Figs. 1-3, abstract, col 2 ln 41- col 3 ln 61, and col 4 ln 27-col 7 ln 20 and Nolte Fig. 1 and para 16-19, 30-31, and 34-41).
It would have been obvious and predictable to have combined the teachings of Bass and Nolte because doing so would ensure that the actual bone surface is registered in 3D space, and not just the position of the position sensor. Such an arrangement would increase the accuracy of registration of the bone in the ultrasound image, the pre-operative images, and the mechanically scanned data. Examiner notes that registering multiple modalities of data allows a surgeon to have a more complex and information-rich surgical display.
Regarding claims 9 and 10, Bass discloses a device, wherein the processor is further configured to identify structural features of bone in the one or more ultrasonic images generated by the ultrasonic assembly through one or more image processing algorithms; and the processor is further configured to correlate the structural features of bone identified from the one or more ultrasonic images with corresponding structural features of bone in one or more preoperative images (see Figs. 1-3, abstract, col 2 ln 41-col 3 ln 61, and col 4 ln 27-col 7 ln 20).
Regarding claim 11, Bass discloses a device, the processor being further configured to map a plurality of registration points from bone in the one or more ultrasonic images with corresponding points in the one or more preoperative images (see Figs. 1-3, abstract, col 2 ln 41- col 3 ln 61, and col 4 ln 27-col 7 ln 20).
Claims 12-15 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent No. 6,106,464 to Bass et al., in view of U.S. PG Pub. No. 2002/0103432 A1 to Kawchuk, and further in view of U.S. PG Pub. No. 2007/0270690 A1 to Woerlein.
Regarding claim 12, Bass discloses an apparatus, comprising: (a) a body having a distal end; (b) a location device, the location device being fixedly positioned relative to the distal end by a first distance, the location device being configured to indicate a real-time position of the location device within three-dimensional space; and (c) a depth-finding module, the depth-finding module being fixedly positioned relative to the location device, the depth-finding module being operable to determine a second distance representing a real-time depth of soft tissue between bone in a patient and a contact point between an outer surface of the patient and the distal end (see Figs. 1-3, abstract, col 2 ln 41- col 3 ln 61, and col 4 ln 27-col 7 ln 20).
Bass discloses an optical tracking system and not an electric or magnetic sensor positioning system.
However, Kawchuk discloses a surgical assistance device, comprising (b) a position sensor, the position sensor being fixedly positioned relative to the distal end, the position sensor being configured to generate a signal indicating a real-time position of the position sensor within three-dimensional space (see Fig. 2 and para 61). Examiner notes that reversing the sensor's location or the emitter's location on the device is well known in the art.
It would have been obvious and predictable to have substituted the electric or magnetic systems of Kawchuk for the optical sensing system of Bass because doing so would achieve the same predictable result, namely tracking the elements in 3D space. Examiner notes that flipping a sensor for an emitter would have been an obvious matter of design choice because doing so would predictably provide 3D tracking.
Woerlein discloses a similar tracking and range finding system, in combination with Bass and Kawchuk, further comprising a patient tracking assembly, the patient tracking assembly being fixedly positioned to the head of a patient, the patient tracking assembly including a second position sensor configured to generate a signal indicating a real-time position of the patient tracking assembly within three-dimensional space (see Figs. 1 and 2 and para 25-29).
It would have been obvious and predictable to use a second position sensor on the patient's head because doing so would provide a starting point and frame of reference for the tracking and range finding of the location device.
Regarding claim 13, Nolte discloses a similar surgical system, further comprising a processor, the processor being configured to determine a real-time distance from the position sensor to the bone in the patient underlying the contact point based on a combination of the first and second distances (see Fig. 8 and para 16-19, 30-31, and 34-41).
It would have been obvious and predictable to have combined the teachings of Bass and Nolte because doing so would ensure that the actual bone surface is registered in 3D space, and not just the position of the position sensor. Such an arrangement would increase the accuracy of registration of the bone in the ultrasound image, the pre-operative images, and the mechanically scanned data.
Regarding claim 14, Bass in combination with Nolte disclose a similar surgical system, the processor being further configured to register a patient with one or more preoperative images based at least in part on the determined real-time distance and the signal indicating a real-time position of the position sensor within three-dimensional space (see Bass Figs. 1-3, abstract, col 2 ln 41-col 3 ln 61, and col 4 ln 27-col 7 ln 20 and Nolte Fig. 8 and para 16-19, 30-31, and 34-41).
It would have been obvious and predictable to have combined the teachings of Bass and Nolte because doing so would ensure that the actual bone surface is registered in 3D space, and not just the position of the position sensor. Such an arrangement would increase the accuracy of registration of the bone in the ultrasound image, the pre-operative images, and the mechanically scanned data. Examiner notes that registering multiple modalities of data allows a surgeon to have a more complex and information-rich surgical display.
Regarding claim 15, Bass discloses a device, wherein the depth finding module is operable to: (i) emit ultrasonic energy through tissue between the outer surface of the patient and the bone in the patient, and (ii) receive ultrasonic energy reflected by the bone in the patient to thereby determine the second distance (see Figs. 1-3, abstract, col 2 ln 41-col 3 ln 61, and col 4 ln 27-col 7 ln 20).
Claims 16-18 are rejected under 35 U.S.C. 103 as being unpatentable over Bass, Kawchuk, and Woerlein as applied to claim 12 above, and further in view of either or both of U.S. PG Pub. No. 2016/0361070 A1 to Ardel and U.S. Patent No. 5,871,445 A to Bucholz.
Regarding claims 16-18, Ardel and Bucholz disclose similar surgical range finders, wherein the depth finding module is operable to: (i) emit light through tissue between the outer surface of the patient and the bone in the patient, and (ii) receive light reflected by the bone in the patient to thereby determine the second distance; the emitted light being within the infrared spectrum; and the emitted light comprising laser light (see Ardel Figs. 5 and 8 and para 240 and Bucholz Fig. 8 and col 15 ln 65-col 16 ln 34).
It would have been an obvious and predictable substitute to use infrared lasers for the ultrasound of Bass because doing so would predictably provide the same measurements as already required in Bass.
Claim 19 is rejected under 35 U.S.C. 103 as being unpatentable over Bass, Kawchuk, and Woerlein as applied to claim 12 above, and further in view of U.S. PG Pub. No. 2008/0039718 A1 to Drinan et al.
Regarding claim 19, Drinan discloses a similar depth finding tissue imaging device, wherein the depth finding module is operable to determine the second distance via ultra-wideband radar (see Figs. 3-5 and para 45-58).
It would have been an obvious and predictable substitute to use UWB radar for the ultrasound of Bass because doing so would predictably provide the same measurements with a small number of inexpensive components enabling low power, portable applications.
Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent No. 6,106,464 to Bass et al., in view of U.S. PG Pub. No. 2002/0103432 A1 to Kawchuk, and further in view of either or both of U.S. PG Pub. No. 2016/0361070 A1 to Ardel and U.S. Patent No. 5,871,445 A to Bucholz.
Regarding claim 20, Bass discloses a method, comprising: (a) positioning a registration probe against an outer surface of a patient at a first contact point; (b) capturing position data via a location device on the registration probe while the registration probe is positioned against the outer surface of the patient at the first contact point, the position data indicating a real-time position of the location device within three-dimensional space; (c) emitting energy through soft tissue underlying the outer surface of the patient, the emitted energy being reflected by bone under the soft tissue; and (d) capturing data from the reflected energy, the captured data representing one or both of: (i) a depth of soft tissue between the first contact point and the bone, or (ii) an image of the bone (see Figs. 1-3, abstract, col 2 ln 41- col 3 ln 61, and col 4 ln 27-col 7 ln 20).
Bass discloses an optical tracking system and not an electric or magnetic sensor positioning system.
However, Kawchuk discloses a similar surgical assistance method, comprising (b) a position sensor, the position sensor being fixedly positioned relative to the distal end, the position sensor being configured to generate a signal indicating a real-time position of the position sensor within three-dimensional space (see Fig. 2 and para 61). Examiner notes that reversing the sensor's location or the emitter's location on the device is well known in the art.
It would have been obvious and predictable to have substituted the electric or magnetic systems of Kawchuk for the optical sensing system of Bass because doing so would achieve the same predictable result, namely tracking the elements in 3D space. Examiner notes that flipping a sensor for an emitter would have been an obvious matter of design choice because doing so would predictably provide 3D tracking.
Ardel and Bucholz disclose similar surgical range finders, wherein the depth finding module is operable to: (i) emit light through tissue between the outer surface of the patient and the bone in the patient, and (ii) receive light reflected by the bone in the patient to thereby determine the second distance; the emitted light being within the infrared spectrum; and the emitted light comprising laser light (see Ardel Figs. 5 and 8 and para 240 and Bucholz Fig. 8 and col 15 ln 65-col 16 ln 34).
It would have been an obvious and predictable substitute to use infrared lasers for the ultrasound of Bass because doing so would predictably provide the same measurements as already required in Bass.
Response to Arguments
Applicant's arguments filed August 13, 2025 have been fully considered but they are not persuasive.
Applicant contends without evidence that A-mode ultrasound imaging is not an image, and therefore, Bass and Nolte do not disclose an image.
Examiner disagrees with Applicant’s opinion because A-mode ultrasound is considered a one-dimensional image in the art.
Further and contrary to Applicant’s contentions, Bass and Nolte disclose images displayed to a user. Bass discloses that the one-dimensional A-mode data is combined with tracking data to create a 3-D model that is displayed to a user (see col 5 ln 9-22). Said display of data reads on the image claimed. Similarly, paragraph 41 of Nolte discloses that the ultrasound data is displayed to a user, which indicates that the depth-based ultrasound of Nolte is also an image.
Applicant next argues that paragraph 8 of Nolte teaches against using an image.
Examiner disagrees because paragraph 8 merely discusses difficulties with some types of images and not imaging in general. Rather, the teachings of Nolte as a whole indicate that A-mode imaging is beneficial.
Applicant argues without rationale or evidence that no reason to combine the references has been made by the Office.
Examiner disagrees because the prior action indicates a reason to combine the cited references.
Turning to claim 12, Applicant argues without rationale or evidence that the prior art does not teach all of the limitations.
Applicant's arguments fail to comply with 37 CFR 1.111(b) because they amount to a general allegation that the claims define a patentable invention without specifically pointing out how the language of the claims patentably distinguishes them from the references.
Applicant argues without rationale or evidence that no reason to combine the references has been made by the Office.
Examiner disagrees because the prior action indicates a reason to combine the cited references.
Turning to claim 20, Applicant contends that claim 20 has similar subject matter to claim 16. First, Examiner notes that claim 16 recites only intended use limitations and is different in scope than claim 20. Applicant also contends that Bucholz and Ardel do not disclose a rangefinder that reflects light off of a bone.
Examiner disagrees because a skilled artisan would have recognized that the forehead mentioned in Bucholz is not the skin but rather the underlying bone. Otherwise, Bucholz would have noted the light would have reflected off of the skin. Further, Ardel at para 139 and 215 clarifies that light range finding is based on energy reflection from the bone. Lastly, in combination with the other cited references a skilled artisan would have recognized that range finding down to the bone level is beneficial and predictable for implementing the method of Bass.
Applicant argues without rationale or evidence that no reason to combine the references has been made by the Office.
Examiner disagrees because the prior action indicates a reason to combine the cited references.
All unchallenged uses of Official Notice have been deemed Applicant Admitted Prior Art.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to RAJEEV P SIRIPURAPU whose telephone number is (571)270-3085. The examiner can normally be reached 9-5 M-F.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor is Keith Raymond. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/RAJEEV P SIRIPURAPU/Primary Examiner, Art Unit 3798