DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application is being examined under the pre-AIA first to invent provisions.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 10/10/2025 has been entered.
Response to Amendment
This action is in response to the remarks filed on 10/10/2025. Claims 2-8, 10, 21, 23, 25, and 26 were pending, and claims 31-45 are newly added. Claims 11-20, 22, 24, and 27-30 are cancelled.
Drawings
The drawings are objected to under 37 CFR 1.83(a). The drawings must show every feature of the invention specified in the claims. Therefore, the feature “each LED is connected to a respective heat sink,” as recited in claim 33, must be shown or the feature(s) canceled from the claim(s). No new matter should be entered.
Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. The figure or figure number of an amended drawing should not be labeled as “amended.” If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA ), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claim 2 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA ), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 2 recites the limitation “a wound site” in line 6. It is unclear whether this refers to the same “wound site” recited in line 2 or to a different wound site.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 2-10, 21, 23, 25, 26 and 31-45 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Claims 2, 37, and 41 recite “determine intensities” and “calculate data.”
The limitation of “determine intensities”, as drafted, is a process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components. That is, other than reciting “by a processor,” nothing in the claim element precludes the step from practically being performed in the mind. For example, but for the “by a processor” language, “determine intensities” in the context of this claim encompasses the user manually calculating intensities. Similarly, the limitation of “calculate data”, as drafted, is a process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components. For example, but for the “by a processor” language, “calculate data” in the context of this claim encompasses the user thinking that the … If a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of generic computer components, then it falls within the “Mental Processes” grouping of abstract ideas. Accordingly, the claim recites an abstract idea.
This judicial exception is not integrated into a practical application. In particular, the claim recites only one additional element – using a processor to perform the “determine intensities” and “calculate data” limitations. The processor in both steps is recited at a high level of generality (i.e., as a generic processor performing the generic computer functions of “determine intensities” and “calculate data”), such that it amounts to no more than mere instructions to apply the exception using a generic computer component. Accordingly, this additional element does not integrate the abstract idea into a practical application because it does not impose any meaningful limits on practicing the abstract idea. The claim is directed to an abstract idea.
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional element of using a processor to perform “determine intensities” and “calculate data” steps amount to no more than mere instructions to apply the exception using a generic computer component. Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept. The claim is not patent eligible.
The dependent claims also recite similar abstract ideas (e.g., determine area, track changes, etc.) without additional elements that are sufficient to amount to significantly more than the judicial exception, for the reasons discussed above with respect to integration of the abstract idea into a practical application.
Therefore, the claims are not patent eligible.
Claim Rejections - 35 USC § 103
The following is a quotation of pre-AIA 35 U.S.C. 103(a) which forms the basis for all obviousness rejections set forth in this Office action:
(a) A patent may not be obtained though the invention is not identically disclosed or described as set forth in section 102, if the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art to which said subject matter pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims under pre-AIA 35 U.S.C. 103(a), the examiner presumes that the subject matter of the various claims was commonly owned at the time any inventions covered therein were made absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and invention dates of each claim that was not commonly owned at the time a later invention was made in order for the examiner to consider the applicability of pre-AIA 35 U.S.C. 103(c) and potential pre-AIA 35 U.S.C. 102(e), (f) or (g) prior art under pre-AIA 35 U.S.C. 103(a).
Claims 2-10, 21, 23, 25, 26, 31-32, 34, 36-38, 40-43 and 45 are rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Chhibber et al (US20070064985A1) in view of Olsen (WO2007005714A2, below citations are from the equivalent of US20080029708A1).
Regarding claim 2, Chhibber teaches a handheld fluorescence-based imaging system for obtaining diagnostic data regarding a wound site (see re-produced fig. 2A below; and “subject 101, or part of it, that is captured in the images include both skin and non-skin portions or features, such as hair, clothing, eyes, lips, nostrils, etc. Furthermore, some of the objects surrounding the subject 101 may also be captured in the images. Therefore, the pixels in the first white-light and UV images” [0047]; “camera and that the subject is allowed to reach full fluorescence under UV illumination” [0049]), the system comprising:
[Reproduced Fig. 2A of Chhibber (media_image1.png, 457 × 619, greyscale)]
an excitation light source configured to emit excitation light including at least one wavelength or wavelength band configured to cause at least one biomarker associated with bacteria at a wound site to fluoresce (“captured by the camera and that the subject is allowed to reach full fluorescence under UV illumination” [0049]; “the UV light sources 120 are turned on to send a flash of UV light to the subject 101. The flash of UV light should include a band of UV wavelengths the can causes the skin associated with the subject 101 to fluoresce” [0050]; “captured fluorescence coming from an inflamed pore upon illumination by a UV flash” [0064]);
an optical sensor configured to detect fluorescence emissions from the at least one biomarker at the wound site (“an image acquisition device 110, at least one light source 120 coupled to the image acquisition device 110” [0037]; “first UV image is captured by the sensor 114” [0050]);
a display (“skin condition can be displayed on a user interface” [0012]); and
a processor (e.g., computing device, microprocessor [0013] etc.) configured to:
receive bacterial fluorescence data based on the detected fluorescence emissions from the at least one biomarker (“skin pixel has a white color and an intensity value exceeds 130, the skin pixel is likely one of a group of contiguous pixels that have captured fluorescence coming from an inflamed pore upon illumination by a UV flash. To confirm, surrounding skin pixels are also examined to see if some of them are also white in color and have intensity values over 130. If none or few of the pixels satisfy this criteria, the first skin pixel is not associated with an inflamed pore. Otherwise, an inflamed pore is identified, and in step 1330, the number of skin pixels associated with the inflamed pore is determined as a measure for the shape and size of the pore” [0064]; “skin condition includes at least one type of pores selected from the group consisting of: inflamed pores, bacteriostatic pores, sluggish oil flow, and deeply inflamed pores” also see claim 21 of Chhibber), and
determine fluorescence intensities for one or more regions of interest of the wound site based on the bacterial autofluorescence data (“a color value and an intensity value associated with each of the skin pixels in the UV image” [0011]),
based on the fluorescence intensities, calculate data indicative of a biodistribution of bacteria at the wound site (“ intensity values are computed from the pixel values associated with each pixel in one of the UV images” [0063]; “an inflamed pore is identified, and in step 1330, the number of skin pixels associated with the inflamed pore is determined as a measure for the shape and size of the pore, and an average of the intensity value associated with the number of skin pixels is computed as a quantitative indication of the severity of the pore” [0064]).
Although a display having touchscreen functionality (i.e., a touchscreen camera) is obvious, widely known, and commercially available, Chhibber does not mention the digital camera 110 (or 200) having touchscreen functionality.
However, in the same field of endeavor, Olsen teaches a digital camera with integrated ultraviolet (UV) response (title). The camera channels provide full color, single color, multi-color, or monochromatic (black and white) capability in any wavelength range from ultraviolet (UV) to infrared (IR). Color filters, if desired, may be on an image sensor or within the optical component layer, or a combination of both. The camera channels may also provide color capability by utilizing the semiconductor absorption properties in a pixel [0040]. Olsen further teaches a digital camera apparatus that includes one or more input devices [0193], where the input devices include but are not limited to buttons, knobs, switches, keyboards, keypads, track ball, mouse, pen and tablet, light pen, and touch screens [0847].
It would have been obvious to one of ordinary skill in the art before the invention was made to modify the method and/or device of the combination of references as outlined above with a touchscreen as taught by Olsen, because it helps to improve image quality ([0012] of Olsen).
Regarding claims 3 and 43, Chhibber teaches wherein the processor is further configured to generate one or more color-coded image maps of the fluorescence intensities (“each pixel in the white-light and UV image includes values associated with the three color channels, which are referred to sometimes in this document as pixel values. The pixel values may range, for example, between 0 and 255.” [0045]; “an inflamed pore is identified, and in step 1330, the number of skin pixels associated with the inflamed pore is determined as a measure for the shape and size of the pore, and an average of the intensity value associated with the number of skin pixels is computed as a quantitative indication of the severity of the pore.” [0064]).
Regarding claim 4, Chhibber teaches wherein the processor is further configured to map the data indicative of the biodistribution of bacteria at the wound site to a representation of the wound site to provide a visual indication of the biodistribution of the bacteria at the wound site (“first skin pixel has a white color and an intensity value exceeds 130, the skin pixel is likely one of a group of contiguous pixels that have captured fluorescence coming from an inflamed pore upon illumination by a UV flash. To confirm, surrounding skin pixels are also examined to see if some of them are also white in color and have intensity values over 130. If none or few of the pixels satisfy this criteria, the first skin pixel is not associated with an inflamed pore. Otherwise, an inflamed pore is identified, and in step 1330, the number of skin pixels associated with the inflamed pore is determined as a measure for the shape and size of the pore, and an average of the intensity value associated with the number of skin pixels is computed as a quantitative indication of the severity of the pore.” [0064]).
Regarding claim 5, Chhibber teaches wherein the processor is further configured to determine a fluorescent area of the wound site based on the bacterial autofluorescence (“captured fluorescence coming from an inflamed pore upon illumination by a UV flash. To confirm, surrounding skin pixels are also examined to see if some of them are also white in color and have intensity values over 130. If none or few of the pixels satisfy this criteria, the first skin pixel is not associated with an inflamed pore. Otherwise, an inflamed pore is identified, and in step 1330, the number of skin pixels associated with the inflamed pore is determined as a measure for the shape and size of the pore, and an average of the intensity value associated with the number of skin pixels is computed as a quantitative indication of the severity of the pore.” [0064]).
Regarding claims 6 and 7, Chhibber teaches wherein the processor is further configured to track and display changes to the fluorescent area of the wound site over time (“display the skin analysis result in a timeline showing changes of selected skin analysis results over time for the same subject 101” [0072]).
Regarding claims 8 and 9, Chhibber teaches wherein the processor is configured to generate, record, and output one or more reports based on the bacterial fluorescence data, including the identification of wound parameters and the tracking of the wound parameters over time (“display selected skin analysis result as compared with previous results related to the same skin condition for the same subject 101.” [0072]).
Regarding claim 10, Chhibber teaches further comprising a white light source configured to emit white light for white light imaging of the wound site (“light sources 120 are configured to illuminate the subject 101 with white light, and another portion of the light sources 120 are configured to emit ultraviolet (UV) light.” [0039]), wherein the optical sensor (“the image acquisition device 110 is part of a digital camera 200 having an image sensor 112 and an optical assembly 114 in front of the image sensor 112 and configured to” [0038]) is configured to detect light reflected by the wound site, and the processor is further configured to receive reflection data based on the detected light reflected by the wound site (“an image acquisition device 110, at least one light source 120 coupled to the image acquisition device 110” [0037]; “first UV image is captured by the sensor 114” [0050]).
Regarding claim 21, Chhibber teaches the processor is further configured to identify an indication of at least one of wound infection, wound healing, and wound healing failure based at least in part on the bacterial fluorescence data (“display selected skin analysis result as compared with previous results related to the same skin condition for the same subject 101” [0072]).
Regarding claim 23, Chhibber teaches wherein the representation of the wound site comprises a fluorescence image of the wound site (“the skin pixel is likely one of a group of contiguous pixels that have captured fluorescence coming from an inflamed pore upon illumination by a UV flash” [0064]).
Regarding claims 25 and 40, Chhibber teaches further comprising a filter configured to prevent the inclusion of reflected excitation light in the determination of fluorescence intensities for the one or more regions of interest of the wound site (“at least a portion of the flash light sources have UV transmission filters installed thereon, and at least a portion of the flash light sources have infrared absorption filters installed thereon” claim 40 of Chhibber).
Regarding claim 26, Chhibber teaches further comprising a plurality of light sources configured to elicit absorption data from tissue components of the wound (“living organisms fluoresce upon excitation through the absorption of light, a phenomenon known as autofluorescence, it has been shown that different organisms can be classified through their Stokes shift values. Stokes shift is the difference between peak wavelength or frequency of an absorption spectrum and peak wavelength or frequency of an emission spectrum” [0051]).
Regarding claim 31, Chhibber teaches wherein the processor is further configured to output measurements related to a shape or a size of the wound site (“pixels are also examined to determine the size and shape of a skin area having the skin condition” [0063]).
Regarding claim 32, Chhibber teaches wherein:
the excitation light source is a first excitation light source of a plurality of excitation light sources (e.g., “light sources 120”), each excitation light source of the plurality of excitation light sources is configured to emit excitation light including at least one wavelength or wavelength band (“ light sources 120 are configured to illuminate the subject 101 with white light, and another portion of the light sources 120 are configured to emit ultraviolet (UV) light” [0039]) configured to cause at least one biomarker associated with bacteria at the wound site to fluoresce (“a band of UV wavelengths the can causes the skin associated with the subject 101 to fluoresce,” [0050]), and the optical sensor is centrally positioned relative to the plurality of light sources (“ digital camera 200 having an image sensor 112 and an optical assembly 114 in front of the image sensor 112 and configured to form an image of the subject 101 on the image sensor 114” [0038]).
Regarding claims 34 and 38, Chhibber teaches wherein the processor is configured to output an image of raw fluorescence detected by the optical sensor to the display (“red-green-blue (RGB) color space, pixels that have the red channel (channel 1) values in the range of 105-255, the green channel (channel 2) values in the range of 52-191, and the blue channel (channel 3)” [0054]; also see [0060]-[0061], [0067]).
Regarding claim 37, Chhibber teaches a handheld fluorescence-based imaging system for obtaining diagnostic data regarding a wound site (see re-produced fig. 2A below; and “subject 101, or part of it, that is captured in the images include both skin and non-skin portions or features, such as hair, clothing, eyes, lips, nostrils, etc. Furthermore, some of the objects surrounding the subject 101 may also be captured in the images. Therefore, the pixels in the first white-light and UV images” [0047]; “camera and that the subject is allowed to reach full fluorescence under UV illumination” [0049]), the system comprising:
[Reproduced Fig. 2A of Chhibber (media_image1.png, 457 × 619, greyscale)]
first and second excitation light sources, each of the first and second excitation light sources being configured to emit excitation light selected to elicit emission of bacterial autofluorescence from bacteria at a wound site, wherein each of the first and second excitation light sources is configured to emit excitation light having a wavelength of 405 nm±10nm (“captured by the camera and that the subject is allowed to reach full fluorescence under UV illumination” [0049]; “the UV light sources 120 are turned on to send a flash of UV light to the subject 101. The flash of UV light should include a band of UV wavelengths the can causes the skin associated with the subject 101 to fluoresce” [0050]; “captured fluorescence coming from an inflamed pore upon illumination by a UV flash” [0064]);
a white light source configured to illuminate the wound site with white light (“camera 200 are white light sources and the light sources 120” [0041]);
an optical sensor positioned between the first and second excitation light sources, the optical sensor configured to (“an image acquisition device 110, at least one light source 120 coupled to the image acquisition device 110” [0037]; “first UV image is captured by the sensor 114” [0050]; also see fig. 2 e.g.);
detect the bacterial autofluorescence emissions from the wound site, and detect signals reflected from the wound site in response to illumination of the wound site with the white light emitted by the white light source (“skin pixel has a white color and an intensity value exceeds 130, the skin pixel is likely one of a group of contiguous pixels that have captured fluorescence coming from an inflamed pore upon illumination by a UV flash. To confirm, surrounding skin pixels are also examined to see if some of them are also white in color and have intensity values over 130. If none or few of the pixels satisfy this criteria, the first skin pixel is not associated with an inflamed pore. Otherwise, an inflamed pore is identified, and in step 1330, the number of skin pixels associated with the inflamed pore is determined as a measure for the shape and size of the pore” [0064]; “skin condition includes at least one type of pores selected from the group consisting of: inflamed pores, bacteriostatic pores, sluggish oil flow, and deeply inflamed pores” also see claim 21 of Chhibber);
a display (“skin condition can be displayed on a user interface” [0012]); and
a processor (e.g., computing device, microprocessor [0013] etc.) configured to:
receive reflection data based on the detected reflected signals and output a white light image of the wound site based on the reflection data, receive bacterial autofluorescence data based on the detected autofluorescence emissions, (“skin pixel has a white color and an intensity value exceeds 130, the skin pixel is likely one of a group of contiguous pixels that have captured fluorescence coming from an inflamed pore upon illumination by a UV flash. To confirm, surrounding skin pixels are also examined to see if some of them are also white in color and have intensity values over 130. If none or few of the pixels satisfy this criteria, the first skin pixel is not associated with an inflamed pore. Otherwise, an inflamed pore is identified, and in step 1330, the number of skin pixels associated with the inflamed pore is determined as a measure for the shape and size of the pore” [0064]; “skin condition includes at least one type of pores selected from the group consisting of: inflamed pores, bacteriostatic pores, sluggish oil flow, and deeply inflamed pores” also see claim 21 of Chhibber), and
determine fluorescence intensities for one or more regions of interest of the wound site based on the bacterial autofluorescence data (“a color value and an intensity value associated with each of the skin pixels in the UV image” [0011]),
based on the fluorescence intensities, calculate data indicative of a biodistribution of bacteria at the wound site (“ intensity values are computed from the pixel values associated with each pixel in one of the UV images” [0063]; “an inflamed pore is identified, and in step 1330, the number of skin pixels associated with the inflamed pore is determined as a measure for the shape and size of the pore, and an average of the intensity value associated with the number of skin pixels is computed as a quantitative indication of the severity of the pore” [0064]);
output to the display a color image indicative of the biodistribution of bacteria at the wound site (see e.g., figs 2, 6, 10 and 15).
Although a display having touchscreen functionality (i.e., a touchscreen camera) is obvious, widely known, and commercially available, Chhibber does not mention the digital camera 110 (or 200) having touchscreen functionality.
However, in the same field of endeavor, Olsen teaches a digital camera with integrated ultraviolet (UV) response (title). The camera channels provide full color, single color, multi-color, or monochromatic (black and white) capability in any wavelength range from ultraviolet (UV) to infrared (IR). Color filters, if desired, may be on an image sensor or within the optical component layer, or a combination of both. The camera channels may also provide color capability by utilizing the semiconductor absorption properties in a pixel [0040]. Olsen further teaches a digital camera apparatus that includes one or more input devices [0193], where the input devices include but are not limited to buttons, knobs, switches, keyboards, keypads, track ball, mouse, pen and tablet, light pen, and touch screens [0847].
It would have been obvious to one of ordinary skill in the art before the invention was made to modify the method and/or device of the combination of references as outlined above with a touchscreen as taught by Olsen, because it helps to improve image quality ([0012] of Olsen).
Regarding claim 41, Chhibber teaches a handheld fluorescence-based imaging system for acquiring data regarding a wound site (see re-produced fig. 2A below; and “subject 101, or part of it, that is captured in the images include both skin and non-skin portions or features, such as hair, clothing, eyes, lips, nostrils, etc. Furthermore, some of the objects surrounding the subject 101 may also be captured in the images. Therefore, the pixels in the first white-light and UV images” [0047]; “camera and that the subject is allowed to reach full fluorescence under UV illumination” [0049]), the system comprising:
a housing configured to be held in a user's hand (see e.g., camera below in figs and pertinent part of the description);
[Reproduced Fig. 2A of Chhibber (media_image1.png, 457 × 619, greyscale)]
an excitation light source connected to the housing the excitation light source being configured to emit excitation light selected to elicit emission of bacterial autofluorescence from bacteria at a wound site, wherein the excitation light source is configured to emit excitation light having a wavelength of 405 nm ± 10 nm; (“captured by the camera and that the subject is allowed to reach full fluorescence under UV illumination” [0049]; “the UV light sources 120 are turned on to send a flash of UV light to the subject 101. The flash of UV light should include a band of UV wavelengths the can causes the skin associated with the subject 101 to fluoresce” [0050]; “captured fluorescence coming from an inflamed pore upon illumination by a UV flash” [0064]);
an optical sensor contained in the housing and configured to (“an image acquisition device 110, at least one light source 120 coupled to the image acquisition device 110” [0037]; “first UV image is captured by the sensor 114” [0050]; also see fig. 2 e.g.):
detect the bacterial autofluorescence emissions from the wound site, and detect signals reflected from the wound site in response to illumination of the wound site with the white light emitted by the white light source (“skin pixel has a white color and an intensity value exceeds 130, the skin pixel is likely one of a group of contiguous pixels that have captured fluorescence coming from an inflamed pore upon illumination by a UV flash. To confirm, surrounding skin pixels are also examined to see if some of them are also white in color and have intensity values over 130. If none or few of the pixels satisfy this criteria, the first skin pixel is not associated with an inflamed pore. Otherwise, an inflamed pore is identified, and in step 1330, the number of skin pixels associated with the inflamed pore is determined as a measure for the shape and size of the pore” [0064]; “skin condition includes at least one type of pores selected from the group consisting of: inflamed pores, bacteriostatic pores, sluggish oil flow, and deeply inflamed pores” also see claim 21 of Chhibber);
a processor (e.g., computing device, microprocessor [0013] etc.) configured to:
receive bacterial autofluorescence data based on the detected autofluorescence emissions, (“skin pixel has a white color and an intensity value exceeds 130, the skin pixel is likely one of a group of contiguous pixels that have captured fluorescence coming from an inflamed pore upon illumination by a UV flash. To confirm, surrounding skin pixels are also examined to see if some of them are also white in color and have intensity values over 130. If none or few of the pixels satisfy this criteria, the first skin pixel is not associated with an inflamed pore. Otherwise, an inflamed pore is identified, and in step 1330, the number of skin pixels associated with the inflamed pore is determined as a measure for the shape and size of the pore” [0064]; “skin condition includes at least one type of pores selected from the group consisting of: inflamed pores, bacteriostatic pores, sluggish oil flow, and deeply inflamed pores” also see claim 21 of Chhibber), and
determine fluorescence intensities for one or more regions of interest of the wound site based on the bacterial autofluorescence data, (“a color value and an intensity value associated with each of the skin pixels in the UV image” [0011]),
based on the fluorescence intensities, calculate data indicative of a biodistribution of bacteria at the wound site (“ intensity values are computed from the pixel values associated with each pixel in one of the UV images” [0063]; “an inflamed pore is identified, and in step 1330, the number of skin pixels associated with the inflamed pore is determined as a measure for the shape and size of the pore, and an average of the intensity value associated with the number of skin pixels is computed as a quantitative indication of the severity of the pore” [0064]);
a filter configured to prevent the inclusion of reflected excitation light in the determination of fluorescence intensities for the one or more regions of interest of the wound site (“at least a portion of the flash light sources have UV transmission filters installed thereon, and at least a portion of the flash light sources have infrared absorption filters installed thereon” claim 40 of Chhibber); and
a display connected to the housing and configured to receive the output from the processor (see e.g., figs 2, 6, 10 and 15).
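The pixel-threshold logic quoted above from Chhibber [0063]-[0064] — flag a skin pixel as belonging to an inflamed pore when it is white with intensity over 130, confirm the flag against surrounding pixels, then measure the pore's size as a pixel count and its severity as an average intensity — can be sketched as follows. This is an illustrative reading only, not code of record; the neighbor-confirmation count and all names are hypothetical.

```python
THRESHOLD = 130      # intensity cutoff quoted in Chhibber [0064]
MIN_NEIGHBORS = 3    # hypothetical count of confirming neighbors

def is_candidate(intensity, is_white):
    """A pixel is a candidate pore pixel if it is white and brighter than the cutoff."""
    return is_white and intensity > THRESHOLD

def analyze_pore(grid, whites):
    """Return (pixel_count, average_intensity) over confirmed pore pixels.

    grid   -- 2D list of intensity values from the UV image
    whites -- 2D list of booleans (True where the pixel color is white)
    """
    rows, cols = len(grid), len(grid[0])
    pore_pixels = []
    for r in range(rows):
        for c in range(cols):
            if not is_candidate(grid[r][c], whites[r][c]):
                continue
            # Confirm by examining surrounding skin pixels, per [0064].
            neighbors = sum(
                is_candidate(grid[nr][nc], whites[nr][nc])
                for nr in range(max(0, r - 1), min(rows, r + 2))
                for nc in range(max(0, c - 1), min(cols, c + 2))
                if (nr, nc) != (r, c)
            )
            if neighbors >= MIN_NEIGHBORS:
                pore_pixels.append(grid[r][c])
    if not pore_pixels:
        return 0, 0.0
    # Pixel count ~ pore shape/size; mean intensity ~ severity ([0064]).
    return len(pore_pixels), sum(pore_pixels) / len(pore_pixels)

# Hypothetical 4x4 UV-image patch: a 2x2 bright white cluster on dark skin.
grid = [[50, 50, 50, 50],
        [50, 200, 200, 50],
        [50, 200, 200, 50],
        [50, 50, 50, 50]]
whites = [[v > THRESHOLD for v in row] for row in grid]
```

On the hypothetical patch, the four 200-intensity pixels each have three confirming neighbors, so the pore is identified with a size of four pixels and a severity (mean intensity) of 200.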
Although displays having a power source contained in the housing and touchscreen functionality (i.e., a touchscreen camera) are obvious, well known, and commercially available, Chhibber does not mention the digital camera 110 (or 200) having touchscreen functionality or a power source contained in the housing.
However, in the same field of endeavor, Olsen teaches a digital camera with integrated ultraviolet (UV) response (title). The camera channels provide full color, single color, multi-color, or monochromatic (black and white) capability in any wavelength range from ultraviolet (UV) to infrared (IR). Color filters, if desired, may be on an image sensor, within the optical component layer, or a combination of both. The camera channels may also provide color capability by utilizing the semiconductor absorption properties in a pixel [0040]. Olsen teaches a digital camera apparatus that includes one or more input devices [0193]. Input devices include, but are not limited to, buttons, knobs, switches, keyboards, keypads, track balls, mice, pens and tablets, light pens, and touch screens [0847]. FIG. 2 shows an example of a digital camera 2 and components thereof. The digital camera includes a digital camera subsystem 200, a circuit board 110, setting controls and/or one or more additional input devices 120, and a power supply 130 [0324].
It would have been obvious to one of ordinary skill in the art before the invention was made to modify the method and/or device of the modified combination of references as outlined above with a touchscreen as taught by Olsen because it helps to improve image quality ([0012] of Olsen).
Regarding claim 42, Chhibber teaches wherein the system is configured to display one or more of optical wavelength information, fluorescence intensity information, and reflectance intensity information with spatial dimensions of the imaged wound site (see e.g., fig. 9 and the associated pars.).
Regarding claims 36 and 45, Chhibber teaches wherein the excitation light source includes a first light and a second light, and wherein at least one of the first and second lights is configured to emit blue excitation light (“white-light and UV images includes values associated with three colors” [0009]; “Having more than one flash light sources 120 allows more uniform exposure of the subject 101 to light during imaging and to allow different light sources to be configured to emit different colors or wavelengths of light,” [0039]; “the first white-light image being in a first color space, such as the red-green-blue (RGB) color space, pixels that have the red channel” [0054]).
Although light sources such as light emitting diodes (LEDs) are obvious, well known, and commercially available, Chhibber does not mention that the plurality of light sources 120, as parts of the digital camera 200, are LEDs.
However, in the same field of endeavor, Olsen teaches a digital camera with integrated ultraviolet (UV) response (title). The camera channels provide full color, single color, multi-color, or monochromatic (black and white) capability in any wavelength range from ultraviolet (UV) to infrared (IR). Color filters, if desired, may be on an image sensor, within the optical component layer, or a combination of both. The camera channels may also provide color capability by utilizing the semiconductor absorption properties in a pixel [0040]. Olsen teaches a digital camera apparatus that includes one or more input devices [0193]. Input devices include, but are not limited to, buttons, knobs, switches, keyboards, keypads, track balls, mice, pens and tablets, light pens, and touch screens [0847]. FIG. 2 shows an example of a digital camera 2 and components thereof. The digital camera includes a digital camera subsystem 200, a circuit board 110, setting controls and/or one or more additional input devices 120, and a power supply 130 [0324]. Olsen further teaches one or more illumination devices, e.g., one or more light emitting diodes (LEDs) with high output intensity [0613], and that the one or more illumination devices are in the form of one or more LEDs (e.g., one or more high power LEDs) [0617].
It would have been obvious to one of ordinary skill in the art before the invention was made to modify the method and/or device of the modified combination of references as outlined above with LEDs as taught by Olsen because it helps to improve image quality ([0012] of Olsen).
Claim 33 is rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Chhibber in view of Olsen and further in view of Piepgras (US 20060262544).
Regarding claim 33, the above noted combination teaches all the limitations of the claim except for each LED being connected to a respective heat sink.
However, in the same field of endeavor, Piepgras teaches that “LED” refers to light emitting diodes of all types (including semiconductor and organic light emitting diodes) that may be configured to generate radiation in one or more of the infrared spectrum, ultraviolet spectrum, and various portions of the visible spectrum (generally including radiation wavelengths from approximately 400 nanometers to approximately 700 nanometers). Some examples of LEDs include, but are not limited to, various types of infrared LEDs, ultraviolet LEDs, red LEDs, blue LEDs, green LEDs, yellow LEDs, amber LEDs, orange LEDs, and white LEDs [0030]. “Spectrum” refers to frequencies (or wavelengths) not only in the visible range, but also frequencies (or wavelengths) in the infrared, ultraviolet, and other areas of the overall electromagnetic spectrum [0035]. Piepgras also teaches monitoring biological characteristics (e.g., acidity, a presence of one or more particular chemical or biological agents, bacteria, etc.) [0152]. An LED assembly 338-3 (the backside of which is visible in FIG. 35) is thermally coupled to the heat sink 540 (e.g., with a gap pad, viscous paste, or liquid metal). The heat sink 540 has fins 510-2 which form channels 542 through which air flows. The LED assembly 338-3 and a chassis 336-3 for supporting secondary optic components 334-2 may be removably attached to the heat sink 540, for example with screws. In some embodiments, the LED assembly 338-3 and the chassis 336-3 may be permanently attached to the heat sink 540, and the entire light-generating module 300-6 incorporating all of the components illustrated in FIG. 35 may be attachable to and removable from lighting fixture housings by a user. The heat sink 540 also may serve as a housing or a support for additional components, electronic or otherwise, for the light-generating module 300-6 [0213].
It would have been obvious to one of ordinary skill in the art before the invention was made to modify the method and/or device of the modified combination of references as outlined above with an LED connected to a heat sink as taught by Piepgras because such lighting apparatus and methods employ LED-based light sources to provide visible light in a variety of environments and for a variety of lighting applications ([0007] of Piepgras).
Claims 35, 39 and 44 are rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Chhibber in view of Olsen and further in view of Aprahamian (US 6393315 B1).
Regarding claims 35, 39 and 44, the above noted combination teaches all the limitations of the claims except for mapping bacterial autofluorescence data and reflectance data onto white light images of the wound site.
However, in the same field of endeavor, Aprahamian teaches detecting and mapping inflamed zones in living tissues. The method includes subjecting the tissues to be analyzed to a luminous excitation with a predetermined spectral domain, acquiring at least the raw fluorescence signal of the porphyrins for a plurality of measurement points, and determining, for each measurement point, the intensity of the fluorescence for the wavelengths characteristic of endogenous porphyrins (abst). The superposition of this latter image on a normal photographic image permits precisely locating the inflammation zone (col. 6, lines 15-18).
It would have been obvious to one of ordinary skill in the art before the invention was made to modify the method and/or device of the modified combination of references as outlined above with mapping bacterial autofluorescence data and reflectance data onto white light images of the wound site as taught by Aprahamian because doing so improves the contrast between healthy tissues and inflamed tissues (col. 5, lines 36-37 of Aprahamian).
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 2-10, 21, 23, 25, 26 and 31-45 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 2-3, 5-9, and 13-20 of copending Application No. 19/025,440 (reference application) and claims 2-3, 6-7, and 9-17 of copending Application No. 19/024,995. Although the claims at issue are not identical, they are not patentably distinct from each other because the above noted claims are directed to similar or obvious inventions of wound detection with fluorescence (i.e., UV detection of bacterial infection, etc.).
This is a provisional nonstatutory double patenting rejection because the patentably indistinct claims have not in fact been patented.
Response to Arguments
Applicant's arguments have been fully considered but they are not persuasive, at least for the reasons noted below.
Regarding the rejections of the claims, the applicant argues the following:
Chhibber does not disclose or suggest "a display including a touchscreen for user control of imaging" or a processor configured to ... "based on the fluorescence intensities, calculate data indicative of a biodistribution of bacteria at the wound site" as recited in amended claim 2 as well as new independent claims 37 and 41. Applicant respectfully requests reconsideration of the rejection over Chhibber and allowance of the pending claims.
Initially, it is noted that the applicant appears to draw a conclusory statement without producing contrary evidence to rebut the prima facie case. As clearly outlined in MPEP 2145, rebuttal evidence and arguments can be presented in the specification, In re Soni, 54 F.3d 746, 750, 34 USPQ2d 1684, 1687 (Fed. Cir. 1995), by way of an affidavit or declaration under 37 CFR 1.132, e.g., Soni, 54 F.3d at 750, 34 USPQ2d at 1687; In re Piasecki, 745 F.2d 1468, 1474, 223 USPQ 785, 789-90 (Fed. Cir. 1984), or otherwise presented during prosecution. See, e.g., MPEP §§ 714 to 716 et seq. However, arguments presented by applicant cannot take the place of factually supported objective evidence. See, e.g., In re Schulze, 346 F.2d 600, 602, 145 USPQ 716, 718 (CCPA 1965); In re De Blauwe, 736 F.2d 699, 705, 222 USPQ 191, 196 (Fed. Cir. 1984). Rather, the applicant merely appears to draw a conclusion that the references do not teach the claimed features.
In addition, the applicant appears to have a narrow interpretation of the very broad claims. The applicant is reminded that, as per MPEP 2111, claims must be given their broadest reasonable interpretation consistent with the specification, though understanding the claim language may be aided by explanations contained in the written description. Yet, it is important not to import into a claim limitations that are not part of the claim. In light of the MPEP, it is noted that the claims merely require what is recited.
Further, Chhibber teaches a method comprising acquiring a white-light image and an ultraviolet (UV) image of a portion of a body surface, such as a person's face, each of the white-light and UV images including a plurality of pixels and each pixel in the UV image corresponding to a respective pixel in the white-light image (abst). The flash of UV light should include a band of UV wavelengths that can cause the skin associated with the subject 101 to fluoresce [0050]. Living organisms fluoresce upon excitation through the absorption of light, a phenomenon known as autofluorescence, and it has been shown that different organisms can be classified through their Stokes shift values. Stokes shift is the difference between the peak wavelength or frequency of an absorption spectrum and the peak wavelength or frequency of an emission spectrum. Furthermore, UV light can penetrate deeper into the skin than visible light, making it possible to detect subsurface skin conditions… autofluorescence of the skin and image processing technologies to provide automated detection and analysis of subsurface skin conditions [0051].
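The Stokes-shift definition cited from Chhibber [0051] amounts to a simple difference of spectral peaks. A minimal sketch, using hypothetical spectra sampled as wavelength-to-intensity maps (all values and names are illustrative assumptions, not from the record):

```python
def peak_wavelength(spectrum):
    """Return the wavelength (nm) at which the measured intensity is highest."""
    return max(spectrum, key=spectrum.get)

def stokes_shift(absorption, emission):
    """Stokes shift in nm: emission peak minus absorption peak ([0051])."""
    return peak_wavelength(emission) - peak_wavelength(absorption)

# Hypothetical spectra: absorption peaks at 365 nm (UV), emission at 450 nm.
absorption = {350: 0.4, 365: 1.0, 380: 0.6}
emission = {430: 0.5, 450: 1.0, 470: 0.7}
```

Under these assumed spectra the Stokes shift is 85 nm; distinct organisms would yield distinct shift values, which is the classification property Chhibber relies on.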
Chhibber further teaches, based at least in part on the detected fluorescence response and with a processor of the portable, handheld imaging system (see reproduced figs. 1 and 2A below; and “camera 200 is converted from a conventional, off-the-shelf digital camera, such as the one shown in FIG. 2C, by adding the light sources 120 on the sides and the bottom” [0041]), generating an output indicative of a presence of one or more of epithelialization, granulation, inflammation, reduced nicotinamide adenine dinucleotide (NADH), flavin adenine dinucleotide (FAD), bacteria, infection, reparation, and a healing response in the tissue target (“skin pixel has a white color and an intensity value exceeds 130, the skin pixel is likely one of a group of contiguous pixels that have captured fluorescence coming from an inflamed pore upon illumination by a UV flash. To confirm, surrounding skin pixels are also examined to see if some of them are also white in color and have intensity values over 130. If none or few of the pixels satisfy this criteria, the first skin pixel is not associated with an inflamed pore. Otherwise, an inflamed pore is identified, and in step 1330, the number of skin pixels associated with the inflamed pore is determined as a measure for the shape and size of the pore” [0064]; “skin condition includes at least one type of pores selected from the group consisting of: inflamed pores, bacteriostatic pores, sluggish oil flow, and deeply inflamed pores” also see claim 21 of Chhibber).
Accordingly, Chhibber teaches the newly added claims as shown above.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SERKAN AKAR whose telephone number is (571)270-5338. The examiner can normally be reached 9am-5pm M-F.
Examiner interviews are available vi