DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 21-27 are rejected under 35 U.S.C. 103 as being unpatentable over Kim et al. (US 2014/0257328) in view of Graham et al. (US 2005/0033145) OR Osterhout et al. (US 2016/0217327), and further in view of Huntzicker et al. (US 2010/0117960).
Claims 1-20 (Cancelled).
Claim 21. Kim teaches a system for capturing images of a target of interest associated with a patient and displaying the captured images to a user, comprising:
a wearable display that is worn by the user and is configured to display to the user a stream of images of the target of interest associated with the patient as the user engages the patient;
Kim: an image display unit 21 for displaying an image that may be transferred through an image capturing device, for example, the endoscopic camera 1003, Fig. 1, [0041-0044]; to approach an affected part of a patient, [0047].
Kim does not teach a wearable display.
Graham teaches a surgeon may operate with the device 12 mounted on his/her head, observing tissue perfusion in areas of interest (tissue region 20) to make a decision on whether or not a particular area, for example, needs to be deeply excised or regrafted, [0029], where Fig. 6 shows an isolated view of tissue region 20 that contains one or more skin burns. Tissue region 20 contains burned region 50, moderately burned region 51 and slightly burned region 52 surrounded by a region of unburned skin 53. The skin burns become progressively more serious toward the interior of tissue region 20. The outermost skin 53 is undamaged. The innermost skin region 54 is charred. It should be noted that the device 12 of the present invention may be used to assess tissue burns, including the depth of a burn, [0037].
Osterhout teaches “A method and apparatus for biometric data capture are provided. The apparatus includes an interactive head-mounted eyepiece worn by a user that includes an optical assembly through which a user views a surrounding environment and displayed content. The optical assembly comprises a corrective element that corrects the user's view of the surrounding environment and an integrated processor for handling content to the user. An integrated optical sensor captures biometric data when the eyepiece is positioned so that a nearby individual is proximate to the eyepiece. Biometric data is captured using the eyepiece and is transmitted to a remote processing facility for interpretation. The remote processing facility interprets the captured biometric data and generates display content based on the interpretation. This display content is delivered to the eyepiece and displayed to the user”, Abstract.
a detector mechanically decoupled from the wearable display and positioned at the target of interest associated with the patient and configured to capture an image field of view of the target of interest associated with the patient; and (See previous step above);
a computing device configured to:
adjust the image field of view of the target of interest associated with the patient as captured by the detector to a selected portion field of view level of the target of interest associated with the patient as the user engages the target of interest of the patient as displayed to the user by the wearable display, in response to each command provided by the user to adjust the image field of view of the detector, thereby enabling the user to control the detector to engage the target of interest associated with the patient. (While Kim teaches an approach to an affected part of a patient, [0047], and a part of the patient's body 6 is incised for surgery, [0057],
Kim does not explicitly teach “adjust a field of view”.
Huntzicker teaches, “FIGS. 10-12 each illustrate different ways a user may adjust the FOV or displayed portion 137 to view different portions of viewable output 130 on display 122. As indicated in FIG. 10, the scale of the FOV of displayed portion 137 may be increased (i.e., zooms in) or decreased (i.e., zooms out)”, [0076-0080]);
instruct the wearable display to display to the user the selected portion field of view level as the detector is capturing the imaging field of view as requested by the user as the user engages the target of interest associated with the patient, wherein the selected portion field of view level that is thereby displayed to the user by the wearable display is less than a total field of view of the user, and display a portion field of view to the user based on the selected field of view portion field of view level as the user engages the target of interest associated with the patient via the wearable display, wherein the portion field of view that is thereby displayed to the user by the display is less than the total field of view of the user. (Kim does not teach this feature. Examiner further notes that Graham also teaches this feature implicitly where the acquired image may provide a large field of view or small field of view, [0041], as shown in Fig. 6 where “depicted is an isolated view of tissue region 20 that contains one or more skin burns. Tissue region 20 contains burned region 50, moderately burned region 51 and slightly burned region 52 surrounded by a region of unburned skin 53. The skin burns become progressively more serious toward the interior of tissue region 20. The outermost skin 53 is undamaged. The innermost skin region 54 is charred. It should be noted that the device 12 of the present invention may be used to assess tissue burns, including the depth of a burn, caused by any source such as thermal, chemical, electrical, UV, biologic, etc.”, [0037].)
In explicit terms, Huntzicker teaches “as only a portion of map 66 is shown at a given time, the area displayed within the FOV will generally be less than the total area of map 66”, [0056] and Figs. 4 and 5. Also, in Fig. 8, controller 126 generates a displayed portion 137 of viewable output 130 that is shown on the display 122, [0067], while Figs. 10-12 show the larger field of view… and the capability to decrease and increase the size, [0076].
Therefore, it would have been obvious to the ordinary artisan before the effective filing date to incorporate the teaching of Graham OR Osterhout into the teaching of Kim for the purpose of utilizing the convenience of a wearable device, which provides the user hands-free operation to perform other tasks; and to further incorporate the teaching of Huntzicker for the purpose of adjusting and displaying a selected portion of the field of view, as set forth above.
Claim 22, further comprising a communication interface, wherein the communication interface is configured to: transfer the stream of images via the communication network to the wearable display simultaneously as the detector captures the stream of images of the target of interest, thereby enabling the user to view the images of the target of interest in real-time as the detector captures the stream of images of the target of interest; and transfer the command via the communication network to the detector simultaneously as the detector captures the stream of images of the target of interest, thereby enabling the user to adjust the field of view of the detector in real-time based on the images of the target of interest as captured by the detector and transferred to the wearable display. (See claim 21).
Claim 23, further comprising a switchable filter arranged between the target of interest and the detector, the switchable filter configured to switch between a first mode in which the switchable filter is in operative communication with the detector for detecting a first frequency spectrum, and a second mode in which the filter is not in operative communication with the detector for detecting a second frequency spectrum. (Graham: Fig. 4, the night vision goggles include one or two eyepieces 22, a focusable lens 24, an optical fluorescence filter 32, an interpupillary adjustment control 28, and diopter adjustment controls 26. The focusable lens 24 may be a single tube objective lens that includes a focusing ring 30 (which may be marked in one embodiment to quickly and accurately focus for a set distance) and the filter 32. FIG. 4 illustrates the filter 32 shown in an "up" position, which allows the night vision goggles to perform as regular night vision goggles, [0030]. Here examiner reads that filter 32 can be switched between positions, i.e., open or closed).
Claim 24, wherein the first mode enables near-infrared (NIR) imaging of the target of interest. (Graham: near-infrared fluorescence imaging, [0035]).
Claim 25, wherein the first mode enables fluorescence imaging of the target of interest, and the first frequency spectrum is selected as compatible with the frequency range of fluorescence of a contrast agent applied to the patient. (Graham: The device may also include means for optically filtering the image to include pixels illuminated at frequencies about a wavelength at which a contrast agent fluoresces or absorbs, [0013] using a near-infrared fluorescence imaging, [0035]).
Claim 26, wherein the detector comprises: a first fluorescence camera and a second fluorescence camera configured to capture a stereoscopic fluorescence view of the target of interest associated with the patient; (Osterhout: two digital cameras or sensors mounted on the eyepiece, [0174]) and a first color camera and a second color camera configured to capture a stereoscopic color view of the target of interest associated with the patient. (Osterhout: the eyepiece may provide 3D display imaging to the user, such as through conveying a stereoscopic, auto-stereoscopic, computer-generated holography, volumetric display image, stereograms/stereoscopes, view-sequential displays, electro-holographic displays, parallax “two view” displays and parallax panoramagrams, re-imaging systems, and the like, creating the perception of 3D depth to the viewer. Display of 3D images to the user may employ different images presented to the user's left and right eyes, such as where the left and right optical paths have some optical component that differentiates the image, where the projector facility is projecting different images to the user's left and right eye's, and the like. The optical path, including from the projector facility through the optical path to the user's eye, may include a graphical display device that forms a visual representation of an object in three physical dimensions. A processor, such as the integrated processor in the eyepiece or one in an external facility, may provide 3D image processing as at least a step in the generation of the 3D image to the user, [0271]).
Claim 27, wherein the computing device is further configured to overlay the stereoscopic fluorescence view onto the stereoscopic color view to generate an overlay that is provided to the wearable display. (Osterhout: the eyepiece may be able to provide an augmented reality by overlaying information onto the user's viewed external environment, such as the overlaid 3D displayed map 1512C, the location information 1514C, and the like, where the displayed map, information, and the like, may change as the user's view changes, [0259]).
Claim 28 is objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Claims 29-40 (Cancelled).
Claims 41-45 (Allowable).
Claims 46-49. These are various ways to express the same activity as described in the independent claim. Please see the rejection of independent claim 21.
Claims 50-51. These are various expressions of independent claim 21 and are rejected in the same manner as claim 21.
Response to Arguments
Applicant's arguments filed 11/24/25 have been fully considered but they are not persuasive.
Examiner wishes also to point out first that the rejection is based on a combination of several prior art references (Kim in view of Graham OR Osterhout, further in view of Huntzicker).
Applicant appears to attack the references individually. In response to applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981), wherein the Court states "The test for obviousness is not whether the features of a secondary reference may be bodily incorporated into the structure of the primary reference; nor is it that the claimed invention must be expressly suggested in any one or all of the references. Rather, the test is what the combined teachings of the references would have suggested to those of ordinary skill in the art".
For example, Kim, Graham and Osterhout all teach detecting/capturing images of a target of interest associated with a patient as a user engages the target of interest.
Instead, applicant emphasized Huntzicker, who, according to applicant, “does not disclose a detector that captures images of a target of interest associated with a patient as a user engages the target of interest. Rather, Huntzicker simply discloses displaying maps via display 22 such that the FOV of the displayed maps may be zoomed in or zoomed out to generate the different FOVs of the displayed maps. Nowhere does Huntzicker disclose imaging, adjusting imaging FOVs, and displaying selected portions of FOVs of a target of interest associated with a patient as a user engages the target of interest. Therefore, in simply disclosing a device that displays maps, Huntzicker fails to teach or suggest at least the features of claim 21 of a system configured to "adjust the imaging field of view of the target of interest associated with the patient as captured by the detector to a selected portion field of view level of the target of interest associated with the patient as the user engages the target of interest of the patient as displayed to the user by the wearable display," "instruct the wearable display to display to the user the selected portion field of view level as the detector is capturing the imaging field of view as requested by the user as the user engages the target of interest associated with the patient," or "display a portion field of view to the user based on the selected field of view portion field of view level as the user engages the target of interest associated with the patient via the wearable display."”
Examiner respectfully disagrees. Firstly, Huntzicker does not simply disclose “a device that displays maps”. Huntzicker was introduced because of its explicit language regarding “the portion field of view that is thereby displayed to the user by the display is less than the total field of view of the user”. For example, as already presented in the rejection above, Huntzicker teaches “as only a portion of map 66 is shown at a given time, the area displayed within the FOV will generally be less than the total area of map 66”, [0056] and Figs. 4 and 5. Also, in Fig. 8, controller 126 generates a displayed portion 137 of viewable output 130 that is shown on the display 122, [0067], while Figs. 10-12 show the larger field of view… and the capability to decrease and increase the size, [0076]. Examiner does not see any counter-analysis from applicant on the provided passages. Please note again that Huntzicker is only one part of the combination, not the whole. Examiner declines to let applicant attack the references individually, based on the conclusion of the Court in In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981).
Even assuming, arguendo, that Huntzicker failed to teach “the portion field of view that is thereby displayed to the user by the display is less than the total field of view of the user”,
Graham at least teaches this feature implicitly where the acquired image may provide a large field of view or small field of view, [0041], as shown in Fig. 6 where “depicted is an isolated view of tissue region 20 that contains one or more skin burns. Tissue region 20 contains burned region 50, moderately burned region 51 and slightly burned region 52 surrounded by a region of unburned skin 53. The skin burns become progressively more serious toward the interior of tissue region 20. The outermost skin 53 is undamaged. The innermost skin region 54 is charred. It should be noted that the device 12 of the present invention may be used to assess tissue burns, including the depth of a burn, caused by any source such as thermal, chemical, electrical, UV, biologic, etc.”, [0037].
Similarly, Osterhout also teaches the feature, “The wearer may be able to increase zoom in the field of view or increase zoom within a partial field of view. In an embodiment, an associated camera may make an image of the object and then present the user with a zoomed picture. A user interface may allow a wearer to point at the area that he wants zoomed, such as with the control techniques described herein, so the image processing can stay on task as opposed to just zooming in on everything in the camera's field of view, [0218]”.
Conclusively and respectfully, the examiner sustains the rejection.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PHUNG-HOANG J. NGUYEN whose telephone number is (571)270-1949. The examiner can normally be reached on a regular schedule, 6:00-3:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Duc Nguyen can be reached at 571-272-7503. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/PHUNG-HOANG J NGUYEN/ Primary Examiner, Art Unit 2691