DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Election/Restrictions
Claims 9-15 are withdrawn from further consideration pursuant to 37 CFR 1.142(b), as being drawn to a nonelected invention, there being no allowable generic or linking claim. Applicant timely traversed the restriction (election) requirement in the reply filed on November 20, 2025.
Applicant's election with traverse of Invention I, claims 1 and 3-8, in the reply filed on November 20, 2025, is acknowledged. The traversal is on the ground(s) that the International Searching Authority (ISA) did not reject these claims for lack of unity of invention, and that the present case therefore falls into the situation where the MPEP urges the Examiner to take the broad practical approach and examine the claims, instead of making a narrow academic distinction. This is not found persuasive. Although the inventions of Groups I, II, III, IV and V all require the technical feature of amended claim 1, that technical feature is not a special technical feature because it does not make a contribution over the prior art (see the rejection of claim 1); the amended claims are therefore still considered to lack unity of invention. Further, Applicant's argument that lack of unity does not exist because the ISA did not present the Groups as lacking unity of invention is not persuasive, as lack of unity has been demonstrated herein.
The requirement is still deemed proper and is therefore made FINAL.
Claim Objections
Claim 8 is objected to because of the following informalities:
In claim 8, in line 2, --- of the set of color conversion functions --- should be inserted after “function”.
Appropriate correction is required.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1 and 3-7 are rejected under 35 U.S.C. 103 as being unpatentable over Granneman (US Pub No. 2018/0220052) in view of Kobayashi (US Pub No. 2007/0093691).
With regards to claims 1 and 3, Granneman discloses an image processor for a medical fluorescence observation device (paragraphs [0057]-[0058], [0066], referring to the image processing circuitry (30) comprising a processor; Figure 1), wherein the image processor is configured to perform operations comprising:
retrieve a digital white-light color image (i.e. “visible light images”) of an object recorded in a first imaged spectrum (i.e. visible light spectrum) (paragraphs [0056]-[0057], referring to the digitized signals each representing streams of images or image representations based on the data and visible light images of image signals (27); paragraphs [0046]-[0047], referring to emitting visible light; Figures 4, 6, referring to steps 301, 401, “Receive first image stream including visible light images having first color space”);
retrieve a digital fluorescence-light color image (i.e. images based on detected fluoresced light of the object) recorded in a second imaged spectrum, the second imaged spectrum overlapping with a fluorescence emission spectrum of at least one fluorophore and being different from the first imaged spectrum, both the first imaged spectrum and the second imaged spectrum overlapping with a visible spectrum (paragraphs [0046]-[0047], referring to fluorescence excitation light; paragraphs [0056]-[0057], referring to the fluoresced light data in signal (29); Figures 4, 6, referring to steps 302, 402, “Receive second image stream including images based on detected fluoresced light”; paragraph [0051], referring to “the fluoresced light is in a spectrum detectable by light sensor 20 that is in or near the visible light spectrum typically detected by a RGB sensor arrays”, and thus both the first imaged spectrum (i.e. visible light spectrum) and the second imaged spectrum (i.e. fluoresced light spectrum) overlap with a visible spectrum); and
output at least one digital output color image of a set of digital output color images containing: a digital output color image generated by the image processor from only the digital fluorescence-light color image, a digital output color image generated by the image processor from only the digital white-light color image and a digital output color image generated by the image processor from a combination of the digital fluorescence-light color image and the digital white-light color image (paragraph [0069], referring to the image processing circuitry (30) which performs digital image processing function for a white-light imaging modality to process and combine visible light images of image signal (27) with the fluoresced light data in signal (29) to produce the desired form of image from the data received; paragraph [0099], referring to image combining occurring at block 313, which combines the converted first image stream and the transformed second image stream into a combined image stream; paragraphs [0063]-[0065], [0096], [0100], referring to the signal being processed by a display controller (82) and presented on an image display (88), wherein the image of the fluorescent data is displayed combined or overlaid on the visible color image (i.e. digital output color image generated from a combination of the digital fluorescence light color image and the digital white-light color image); Figures 1-2, 4, 6);
compute the digital output color image generated from only the digital fluorescence-light color image by applying at least a first color conversion function of a set of color conversion functions to the fluorescence-light color image, wherein the image processor comprises the set of color conversion functions, each respective color conversion function of the set of color conversion functions being configured to map a color of an input pixel to a different color of an output pixel (paragraph [0093], referring to transforming the second image stream to a portion of the second color space outside the compressed color space produced by block 305, or if no compressed color space is used for the white-light images at block 305, block 306 transforms the second image stream to a desired color or color range that has been chosen to be highly visible when overlaid with visible light images, wherein such transformations correspond to a first color conversion function that maps a color of an input pixel to a different color of an output pixel; paragraphs [0078]-[0080], referring to scaling luminance values of the FI images, wherein “scaling” corresponds to a first color conversion function; paragraph [0096], referring to transforming the second image stream by using intensity scaling or color transformation, which corresponds to a first color conversion function; paragraph [0020], referring to the color space conversion performed by the image processing circuitry, which formats the FI image stream to a color format inside the second color space and outside the first color space; Figures 4-7);
compute the digital output color image generated from only the digital white-light color image by applying at least a second color conversion function of the set of color conversion functions to the white-light color image (paragraph [0103], referring to the format conversion at block 405 (i.e. converting first image stream to larger color space) being directly calculated with a matrix multiplication (i.e. second color conversion function) of the RGB values in the first color stream; paragraph [0020], referring to the color space conversion performed by the image processing circuitry and converts a format of the WL image stream into a second data format having a second color space larger than its original color space; Figures 4-7); and
compute the digital output color image generated from the combination of the digital fluorescence-light color image and the digital white-light color image by applying at least a third color conversion function of the set of color conversion functions to: the white-light color image, the fluorescence-light color image and/or the combination of the digital fluorescence-light color image and the digital white-light color image (paragraph [0093], referring to transforming the second image stream to a portion of the second color space outside the compressed color space produced by block 305, or if no compressed color space is used for the white-light images at block 305, block 306 transforms the second image stream to a desired color or color range that has been chosen to be highly visible when overlaid with visible light images; paragraph [0099], referring to combining the converted first image stream and the transformed second image stream into a combined image stream, wherein the combination may be done by overlaying or alpha blending the images, and thus the combination of the converted first image stream and the transformed second image stream would result in a third color space; paragraphs [0102]-[0103], referring to transforming the first color space to a new, second, data format which has a larger color space than the first color space, wherein the second color space can be defined by at least three or four primaries; Figures 4-7).
However, Granneman does not specifically disclose that the operations further comprise receive a display selection signal from a selector device and select the at least one digital output color image from the set for outputting depending on the display selection signal.
Further, with regards to claim 3, Granneman does not specifically disclose that the image processor is configured to output simultaneously at least two digital output color images of the set.
Kobayashi discloses an electronic endoscope that can generate and display a normal image and a high-quality fluorescent image of the same subject as the normal image, as real time images by brightness control of a single imaging optical system (Abstract; paragraph [0007]). The electronic endoscope includes a mode selection button (44) on the surface of the processor (30), wherein depression of the mode selection button (44) allows a plurality of image modes to be selected (paragraph [0032]; Figure 1). That is, a normal image mode where a normal image (i.e. “digital output color image generated by the image processor from only the white-light color image”) is generated based on the reflected light of the white light and displayed on the monitor 60; a fluorescent image mode where a fluorescent image (i.e. “digital output color image generated by the image processor from only the digital fluorescence-light color image”) based on the fluorescent light is generated and displayed; a plurality images mode where a normal image and a fluorescent image are simultaneously generated and displayed (i.e. “image processor is configured to output simultaneously at least two digital output color images of the set”); and a pseudo-color image mode where a pseudo-color image (i.e. “digital output color image generated by the image processor from a combination of the digital fluorescence-light color image and the digital white-light color image”) based on the reflected light of the white light and the fluorescent light (i.e., corresponding to the normal image signals and the fluorescent image signals) is generated and displayed can be selected (paragraph [0032]).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have the operations of Granneman further comprise receive a display selection signal from a selector device and select the at least one digital output color image from the set for outputting depending on the display selection signal and further have the image processor of Granneman be configured to output simultaneously at least two digital output color images of the set, as taught by Kobayashi, in order to selectively display real-time images representing the same subject at the time of receiving the reflected light and the fluorescent light, thereby aiding in providing diagnosis of the examined tissue (Abstract; paragraph [0007]).
With regards to claim 4, Granneman discloses that the digital white-light color image is representative of a reflectance image of the object under illumination with a fluorescence excitation spectrum of the at least one fluorophore, and/or the digital fluorescence-light color image is representative of fluorescence emission of the at least one fluorophore (paragraphs [0022], [0045]-[0049], [0060], referring to detecting reflected light components for a white-light (WL) modality and detecting fluoresced light components for a fluorescence imaging (FI) modality; Figures 4-7).
With regards to claim 5, Granneman discloses that the digital white-light color image is representative of a reflectance image of the object under white-light illumination, and/or the digital fluorescence-light color image is representative of fluorescence emission of the at least one fluorophore (paragraphs [0045]-[0049], referring to the light source illuminating the subject scene with visible light and fluorescent excitation light, wherein an endoscope is capable of white-light and fluorescence imaging modalities; Figures 1, 4-7).
With regards to claim 6, Granneman discloses that the digital white-light color image and the digital fluorescence-light color image are representative of a reflectance image of the object under white-light illumination (paragraphs [0046]-[0048], referring to white-light/visible light imaging; paragraph [0051], referring to “the fluoresced light is in a spectrum detectable by light sensor 20 that is in or near the visible light spectrum typically detected by a RGB sensor arrays”, and thus both the white-light color image and fluorescence-light color image are representative of an object under white-light illumination; Figures 4-7).
With regards to claim 7, Granneman discloses that the first imaged spectrum and the second imaged spectrum are complementary to one another (paragraphs [0009], [0051], [0071], [0090], referring to image signal (27) undergoing color space conversion which may involve compressing the color space to allow “better” distinguishability from fluorescent display colors, and allowing the FI images to be combined with the first image stream “without any color overlap”, thus improving the ability to visually distinguish the FI images, and therefore the first imaged spectrum is complementary to the second imaged spectrum as they do not overlap; Figure 5).
Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Granneman in view of Kobayashi, as applied to claim 1 above, and further in view of Higgins (US Pub No. 2005/0083352).
With regards to claim 8, as discussed above, the above combined references meet the limitations of claim 1. Further, Granneman discloses that the linear transformation comprises a color conversion matrix having a dimension of a number X times a number Y (paragraphs [0015], [0033], [0073], referring to the conversion being calculated using a matrix multiplication of the RGB values in the first color stream and referring to the color conversion matrix, wherein a matrix is inherently defined by a dimension of a number (i.e. X) times another number (i.e. Y)). Granneman further discloses that the first color space is defined by “at least three primaries” (paragraph [0088]).
However, Granneman does not specifically disclose that each color conversion function is a 3x3, a 6x3 or a 3x6 matrix.
Higgins discloses a method for converting from a source color space to a target color space, wherein the source color space results from a combination of N primary color points and the target color space results from a combination of N+1 or more primary color points in the target color space, thus providing multiple primary conversions (Abstract; paragraph [0038]). Matrices may be combined together to perform conversion directly without going through intermediate color spaces (paragraph [0038]). As depicted in Figure 5, a 3x6 matrix may be used for converting 3-valued colors for a 6-primary display (paragraph [0046]; Figure 5, wherein it is depicted that an X by Y matrix is used, wherein X is the quantity of source color bands (equal to 3 color bands in Granneman) and Y is the quantity of target color bands (equal to 3 target color bands in Granneman); note that in Granneman, the color conversion function would hence be a 3x3 matrix, as the RGB color space is converted to a different RGB color space).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have each color conversion function of Granneman be a 3x3 matrix, as taught by Higgins, in order to effectively provide multiple primary color conversions (paragraph [0038]).
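For illustration only (this sketch is not part of the prosecution record, and the matrix values in it are hypothetical), the 3x3 color conversion discussed in the rejection above reduces to a single matrix multiplication per pixel:

```python
# Illustrative sketch only: a color conversion function realized as a
# 3x3 matrix, of the kind discussed for Granneman and Higgins above.
# The matrix values below are hypothetical examples, not taken from
# either reference.

# Hypothetical 3x3 conversion matrix (rows: target RGB bands,
# columns: source RGB bands).
MATRIX_3X3 = [
    [0.9, 0.1, 0.0],
    [0.0, 1.0, 0.0],
    [0.1, 0.0, 0.9],
]

def convert_pixel(rgb, matrix):
    """Map an input pixel color to an output pixel color by matrix multiplication."""
    return [sum(m * c for m, c in zip(row, rgb)) for row in matrix]

# A pure-red source pixel maps to a different color in the target space.
out = convert_pixel([1.0, 0.0, 0.0], MATRIX_3X3)
```

A 3x6 or 6x3 matrix, as in Higgins, would simply change the row or column count to match the number of source and target color bands.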
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 1 and 3-8 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-4 of copending Application No. 18/864,581 in view of Kobayashi.
With regards to claims 1 and 3, claims 1-4 of the copending application meet most of the limitations of instant claim 1 (i.e. retrieving a digital white-light color image, retrieving a digital fluorescence-light image and generating a digital output color image from a combination of the digital fluorescence-light color image and the digital white-light color image, computing the digital output color image generated from only the digital fluorescence-light color image by applying at least a first color conversion function of a set of color conversion functions to the fluorescence-light color image, computing the digital output color image generated from only the digital white-light color image by applying at least a second color conversion function to the white-light color image and computing the digital output color image generated from the combination of images by applying at least a third color conversion function to the white-light color image, the fluorescence-light color image and/or the combination of the images, wherein the steps are inherently/necessarily performed by a processor/computer) (see copending claims 3-4, wherein the mapping of the first and second color bands onto third color bands using a linear transformation comprising a color conversion matrix, wherein each first color band of the first color bands and each second color band of the second color bands are input as separate color bands into the linear transformation, corresponds to using color conversion functions to the fluorescence-light color image and to the white-light color image to compute the respective digital output color images as claimed in instant claim 1).
However, the copending application does not specifically disclose that the operations further comprise output at least one digital output color image of a set of digital output color images containing: a digital output color image generated by the image processor from only the digital fluorescence-light color image; a digital output color image generated by the image processor from only the digital white-light color image; and a digital output color image generated by the image processor from a combination of the digital fluorescence-light color image and the digital white-light color image; receive a display selection signal from a selector device; and select the at least one digital output color image from the set for outputting depending on the display selection signal. Further, the copending application does not specifically disclose that the image processor is configured to output simultaneously at least two digital output color images of the set.
Kobayashi discloses an electronic endoscope that can generate and display a normal image and a high-quality fluorescent image of the same subject as the normal image, as real time images by brightness control of a single imaging optical system (Abstract; paragraph [0007]). The electronic endoscope includes a mode selection button (44) on the surface of the processor (30), wherein depression of the mode selection button (44) allows a plurality of image modes to be selected (paragraph [0032]; Figure 1). That is, a normal image mode where a normal image (i.e. “digital output color image generated by the image processor from only the white-light color image”) is generated based on the reflected light of the white light and displayed on the monitor 60; a fluorescent image mode where a fluorescent image (i.e. “digital output color image generated by the image processor from only the digital fluorescence-light color image”) based on the fluorescent light is generated and displayed; a plurality images mode where a normal image and a fluorescent image are simultaneously generated and displayed (i.e. “image processor is configured to output simultaneously at least two digital output color images of the set”); and a pseudo-color image mode where a pseudo-color image (i.e. “digital output color image generated by the image processor from a combination of the digital fluorescence-light color image and the digital white-light color image”) based on the reflected light of the white light and the fluorescent light (i.e., corresponding to the normal image signals and the fluorescent image signals) is generated and displayed can be selected (paragraph [0032]).
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to have the operations of the copending application further comprise output at least one digital output color image of a set of digital output color images containing: a digital output color image generated by the image processor from only the digital fluorescence-light color image; a digital output color image generated by the image processor from only the digital white-light color image; and a digital output color image generated by the image processor from a combination of the digital fluorescence-light color image and the digital white-light color image; receive a display selection signal from a selector device; and select the at least one digital output color image from the set for outputting depending on the display selection signal and have the image processor of the copending application be configured to output simultaneously at least two digital output color images of the set, as taught by Kobayashi, in order to selectively display real-time images representing the same subject at the time of receiving the reflected light and the fluorescent light, thereby aiding in providing diagnosis of the examined tissue (Abstract; paragraph [0007]).
With regards to instant claims 4 and 5, claim 1 of the copending application sets forth the same limitations (i.e. the second imaged spectrum overlapping with a fluorescence emission spectrum of at least one fluorophore would require that the digital fluorescence-light color image is representative of fluorescence emission of the at least one fluorophore).
With regards to instant claim 6, claim 1 of the copending application sets forth the same limitations (i.e. a digital “white-light” color image is inherently representative of a reflectance image of the object under white-light illumination, and further the second imaged spectrum overlapping with the visible spectrum would inherently require that the digital fluorescence-light color image is representative of a reflectance image of the object under white-light illumination as well).
With regards to instant claim 7, claim 2 of the copending application sets forth the same limitations.
With regards to instant claim 8, claim 4 of the copending application sets forth the same limitations (i.e. the color conversion matrix can correspond to a 3x3, 6x3 or 3x6 matrix).
This is a provisional nonstatutory double patenting rejection.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Fengler et al. (US Pub No. 2011/0063427) disclose a NIR endoscopic imaging device to provide a continuous color image and either a superimposed or side-by-side display of the NIR image information (paragraphs [0031]-[0032], [0043]).
Any inquiry concerning this communication or earlier communications from the examiner should be directed to KATHERINE L FERNANDEZ whose telephone number is (571)272-1957. The examiner can normally be reached Monday-Friday 9:00 AM - 5:30 PM (ET).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Pascal Bui-Pho can be reached at (571) 272-2714. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/KATHERINE L FERNANDEZ/Primary Examiner, Art Unit 3798