DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Status
Claims 1-20 are pending in Application No. 18/484,306, filed October 10, 2023. In the remarks and amendments received on December 9, 2025, claims 1-8, 10-17, and 19 were amended; no claims were cancelled; and no claims were added. Accordingly, claims 1-20 are currently pending and examined herein.
Response to Amendment
Applicant’s amendments filed December 9, 2025, have overcome the objections previously set forth in the Non-Final Office Action mailed October 21, 2025. Accordingly, the objections are withdrawn.
Response to Arguments
Applicant’s arguments filed December 9, 2025, with respect to the rejection of claims 1-2, 10-11, and 19 have been fully considered but are moot because they do not apply to the new grounds of rejection, necessitated by Applicant’s amendments, set forth in this Office action.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claim(s) 1-3, 5, 10-12, 14, and 19 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Heitz III et al. (US-20210400167-A1).
Regarding claim 1, Heitz III teaches:
A processor comprising: one or more processing units (“a computing system (e.g., a home system) that includes one or more processors and memory,” Para [0007]) to:
generate a frame of infrared (IR) image data (“obtaining… (b) infrared (IR) video data,” Para [0006]) and a frame of color image data (“obtaining… (a) color video data,” Para [0006]) having at least partially overlapping fields of view (“Colorizing the images, e.g., based on prior color images of the same scene or objects,” Para [0024], where the “images” are IR);
and generate a frame of synthesized color image data colorizing the frame of IR image data using the frame of color image data (“colorizing the IR video data based on a subset of the color video data,” Para [0006]) based at least on a determination that one or more first classified regions of the frame of color image data (“common features” in the “prior image data” which “includes, one or more images in an RGB color space,” Para [0156]) correspond to one or more second classified regions (“common features”) of at least one of the frame of synthesized color image data or the frame of IR image data (“the prior image data is selected from a set of stored image data (e.g., prior images 344) based on one or more common features with the IR images, e.g., objects determined to be in similar positions in both the prior image and the IR image,” Para [0156]).
Regarding claim 2, the rejection of claim 1 is incorporated herein. Heitz III teaches the processor of claim 1, and the one or more processing units further to colorize, for each successive frame of one or more frames of a video feed of a monitoring system (“one or more video cameras,” Para [0006]), a corresponding successive frame of IR image data using a corresponding frame of color image data (“colorizing the IR video data based on a subset of the color video data; and (3) presenting the colorized video data to a user in real time,” Para [0006], i.e., the video data is colorized in real time).
Regarding claim 3, the rejection of claim 1 is incorporated herein. Heitz III teaches the processor of claim 1, and the one or more processing units further to transfer at least one of one or more colors or one or more color statistics of the one or more first classified regions of the frame of color image data (“The prior images 814-1 and 814-3 show that the mat 813 has square blocks that are yellow, green, and red in color,” Para [0164]) to the one or more second classified regions of the at least one of the frame of synthesized color image data or the frame of IR image data (“The prior images 814-1 and 814-3 show that the mat 813 has square blocks that are yellow, green, and red in color,” Para [0164], and “FIG. 8B further shows a colorization of the IR image 812 based on the prior images 814 in colorized image 816,” Para [0164]).
Regarding claim 5, Heitz III teaches the processor of claim 1, further teaching the one or more processing units further to transfer at least one of one or more colors or one or more color statistics of one or more segmented body parts (of “Jack and Jill”; “entities are identified as humans, pets, and cars… humans are identified as particular individuals known to the smart home system (e.g., identified as Jack and Jill respectively). In some implementations, the entity IR data is colorized (714-2) based on its identification,” Para [0154]) corresponding to the one or more first classified regions of the frame of color image data (“The prior images 814-1 and 814-3 show that the mat 813 has square blocks that are yellow, green, and red in color,” Para [0164]) to one or more corresponding body parts (of “Jack and Jill”) corresponding to the one or more second classified regions of at least one of the frame of synthesized color image data or the frame of IR image data (“The prior images 814-1 and 814-3 show that the mat 813 has square blocks that are yellow, green, and red in color,” Para [0164], and “FIG. 8B further shows a colorization of the IR image 812 based on the prior images 814 in colorized image 816,” Para [0164]).
Claims 10-12, and 14 are system claims that correspond to processor claims 1-3, and 5. Claims 10-12, and 14 are rejected for the same reasons as claims 1-3, and 5.
Claim 19 is a method claim that corresponds to processor claim 1. Claim 19 is rejected for the same reason as claim 1.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 4 and 13 is/are rejected under 35 U.S.C. 103 as being unpatentable over Heitz III et al. (US-20210400167-A1) as applied to claims 1 and 10 above, and further in view of Qu et al., "Manga Colorization", ACM Transactions on Graphics (TOG), Volume 25, Issue 3, hereinafter referred to as "Qu".
Regarding claim 4, the rejection of claim 1 is incorporated herein. Heitz III teaches the processor of claim 1, but fails to teach the following limitations as further claimed. Qu, however, further teaches the one or more processing units further to colorize at least a portion of the IR image data (Qu, Fig. 1(d)) based at least on propagating one or more colors of the one or more first classified regions of the frame of color image data (Heitz III, Fig. 8B, “prior images” 814) as one or more seed colors to the one or more second classified regions of at least one of the frame of synthesized color image data or the frame of IR image data (Qu, Fig. 1(a), the lines indicating a color in a specific region of the image of Fig. 1(a)), and using the one or more seed colors as input designating one or more target regions to colorize (Qu, Fig. 1(d), a result from the color input of Fig. 1(a) where segmented regions are fully colored).
[Image: media_image1.png (greyscale)]
[Image: media_image2.png (greyscale)]
Qu is considered analogous art to the claimed invention because both are in the same field of colorizing black-and-white or infrared images. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated the teachings of Qu into Heitz III for the benefit of achieving complete colorization from a small amount of color input.
Claim 13 is a system claim that corresponds to processor claim 4. Claim 13 is rejected for the same reason as claim 4.
Claim(s) 6-8 and 15-17 is/are rejected under 35 U.S.C. 103 as being unpatentable over Heitz III et al. (US-20210400167-A1) as applied to claims 1 and 10 above, and further in view of Li et al., "Automatic Example-Based Image Colorization Using Location-Aware Cross-Scale Matching," in IEEE Transactions on Image Processing, vol. 28, no. 9, pp. 4606-4619, hereinafter referred to as "Li".
Regarding claim 6, the rejection of claim 1 is incorporated herein. Heitz III teaches the processor of claim 1, but fails to teach the following limitations as further claimed. Li, however, teaches the one or more processing units further to transfer at least one of one or more colors or one or more color statistics of the one or more first classified regions of the frame of color image data (Fig. 4(b), the “reference image”) to the one or more second classified regions of the synthesized color image data (Figs. 4(d) and (e)) based at least on determining that a difference (“location violation”) between the one or more first classified regions of the frame of color image data (Fig. 4(b)) and the one or more second classified regions of the synthesized color image data (Fig. 4(e)) exceeds a threshold (“semantically incorrect colorization is related to up-down location violation (i.e. grass should not appear above the sky), and such location based “knowledge” can be automatically learnt from the single reference image,” pg. 4611, Section III, Part B).
[Image: media_image3.png (greyscale)]
Li is considered analogous art to the claimed invention because both are in the same field of transferring color from an RGB image to an infrared or black-and-white image. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated the teachings of Li into Heitz III for the benefit of producing accurately colorized infrared or black-and-white images.
Regarding claim 7, the rejection of claim 1 is incorporated herein. Heitz III teaches the processor of claim 1, but fails to teach the following limitations as further claimed. Li, however, teaches the one or more processing units further to encode the frame of synthesized color image data (Fig. 4(d)) into a latent representation (transferring color to transform Fig. 4(a) into 4(d)), modify one or more dimensions of the latent representation to generate a modified latent representation (Fig. 4(g), the modified image of Fig. 4(d)), and decode the modified latent representation (Fig. 4(h), the changed colors are now shown with the coloring errors fixed).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated the teachings of Li into Heitz III for the benefit of producing accurately colorized infrared or black-and-white images.
Regarding claim 8, the rejection of claim 1 is incorporated herein. Heitz III teaches the processor of claim 1, but fails to teach the following limitations as further claimed. Li, however, teaches the one or more processing units further to determine to modify one or more dimensions of a latent representation of the frame of synthesized color image data (Figs. 4(d) to (g), where coloring errors are fixed “from the single reference image”) based at least on determining that a difference (“location violation”) between one or more segmented regions of the frame of color image data (Fig. 4(b)) and one or more corresponding segmented regions of the frame of synthesized color image data (Fig. 4(e)) exceeds a threshold (“semantically incorrect colorization is related to up-down location violation (i.e. grass should not appear above the sky), and such location based “knowledge” can be automatically learnt from the single reference image,” pg. 4611, Section III, Part B).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated the teachings of Li into Heitz III for the benefit of producing accurately colorized infrared or black-and-white images.
Claims 15-17 are system claims that correspond to processor claims 6-8. Claims 15-17 are rejected for the same reasons as claims 6-8.
Claim(s) 9, 18, and 20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Heitz III et al. (US-20210400167-A1) as applied to claims 1, 10, and 19 above, and further in view of Dong et al., "A Colorization Framework for Monochrome-Color Dual-Lens Systems Using a Deep Convolutional Network," in IEEE Transactions on Visualization and Computer Graphics, vol. 28, no. 3, pp. 1469-1485, hereinafter referred to as “Dong”.
Regarding claim 9, Heitz III teaches the processor of claim 1, but fails to teach the following limitations as further claimed. Dong, however, further teaches wherein the processor is comprised in at least one of:
a control system for an autonomous or semi-autonomous machine;
a perception system for an autonomous or semi-autonomous machine;
a system for performing simulation operations;
a system for performing digital twin operations;
a system for performing light transport simulation;
a system for performing collaborative content creation for 3D assets;
a system for performing deep learning operations (Dong, “The proposed deep convolutional network is implemented with TensorFlow,” pg. 1475, Section 5, Part 5.2);
a system for performing real-time streaming;
a system implemented using an edge device;
a system implemented using a robot;
a system for performing conversational AI operations;
a system for generating synthetic data;
a system incorporating one or more virtual machines (VMs);
a system implemented at least partially in a data center; or
a system implemented at least partially using cloud computing resources.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have incorporated the teachings of Dong into Heitz III for the benefit of automatic feature learning.
Examiner’s Note: Claim 9 as recited is treated as a “field of use” or intended-use limitation and therefore carries no patentable weight, although it has been examined in view of Heitz III in view of Dong above. The processor as recited has been examined as set forth for claim 1 above. With respect to the enumerated environments in which said processor is “comprised,” the specification merely mentions these environments as preferred intended-use environments, without specific details establishing that incorporating the processor into these environments results in a novel and non-obvious structural change to the processor. Applicant’s attention is also directed to MPEP 2112.01.
Claim 18 is a system claim that corresponds to processor claim 9. Claim 18 is rejected for the same reasons as claim 9.
Claim 20 is a method claim that corresponds to processor claim 9. Claim 20 is rejected for the same reason as claim 9.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to RACHEL A OMETZ whose telephone number is (571)272-2535. The examiner can normally be reached 6:45am-4:00pm ET Monday-Thursday, 6:45am-1:00pm ET every other Friday.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Vu Le can be reached at 571-272-7332. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Rachel Anne Ometz/Examiner, Art Unit 2668 1/20/26
/VU LE/Supervisory Patent Examiner, Art Unit 2668