Prosecution Insights
Last updated: April 19, 2026
Application No. 18/485,970

TONE MAPPING IN HIGH-RESOLUTION IMAGING SYSTEMS

Non-Final Office Action: §102, §103
Filed: Oct 12, 2023
Examiner: ANSARI, TAHMINA N
Art Unit: 2674
Tech Center: 2600 — Communications
Assignee: Samsung Electronics Co., Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 86% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 8m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 86%, above average (743 granted / 868 resolved; +23.6% vs Tech Center average)
Interview Lift: +17.9% allowance improvement among resolved cases with an interview
Typical Timeline: 2y 8m average prosecution; 33 applications currently pending
Career History: 901 total applications across all art units
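As a quick sanity check, the headline allow rate follows directly from the counts shown in this panel (the interview-lift figure cannot be reproduced here, since the per-cohort counts are not displayed):

```python
# Verify the career allow rate from the counts shown in the panel.
granted, resolved = 743, 868
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # 85.6%, displayed rounded as 86%
```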

Statute-Specific Performance

§101: 12.2% (-27.8% vs TC avg)
§103: 40.4% (+0.4% vs TC avg)
§102: 22.6% (-17.4% vs TC avg)
§112: 10.5% (-29.5% vs TC avg)
Based on career data from 868 resolved cases; comparisons are against the Tech Center average estimate.

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

Claims 1-20 are pending in this application. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Specification

The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.

Examiner Note - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are directed to statutory subject matter under 35 U.S.C. 101 because the claimed invention is NOT directed to a judicial exception. The claims fall within at least one of the four categories of patent eligible subject matter and are not directed to a judicial exception (i.e., an abstract idea such as a mathematical concept) without significantly more.
(1) Are the claims directed to a process, machine, manufacture or composition of matter; (2A) Prong One: Are the claims directed to a judicially recognized exception, i.e., a law of nature, a natural phenomenon, or an abstract idea; Prong Two: If the claims recite a judicial exception under Prong One, is the judicial exception integrated into a practical application; (2B) If the claims are directed to a judicial exception and do not integrate it into a practical application, do the claims provide an inventive concept.

(Step 1) In the context of the flowchart in MPEP § 2106, subsection III, Step (1) asks: Are the claims directed to a process, machine, manufacture or composition of matter? YES: Independent Claim 1 is a method, Claim 9 a device, and Claim 17 a non-transitory medium.

(Step 2A) In the context of the flowchart in MPEP § 2106, subsection III, Step 2A Prong One asks: Is the claim directed to a law of nature, a natural phenomenon, or an abstract idea? NO. Under the broadest reasonable interpretation, the instant claims are NOT directed to a judicial exception. Although elements of a mathematical concept and an image processing algorithm are applied, the claims are directed to an image correction and generation process based on the features claimed, presented below for clarity. As a whole, the claimed features can be interpreted as statutory subject matter. 1.
A method comprising: obtaining multiple image frames captured using at least one imaging sensor; generating a local tone map, a global tone map look-up table (LUT), and one or more contrast enhancement LUTs based on at least one of the image frames and one or more parameters of the at least one imaging sensor; generating a blended and demosaiced image based on the image frames; generating a local tone mapped image based on the blended and demosaiced image and the local tone map; adjusting color saturation based on the local tone mapped image to generate a corrected image; and generating an output image based on the corrected image, the global tone map LUT, and the one or more contrast enhancement LUTs.

Since the claim as a whole integrates the exception into a practical application, the claim is not directed to the judicial exception (Step 2A: NO). No further analysis is required, but the claims also qualify under the "significantly more" analysis of Step 2B (where a claim may still be eligible if it amounts to an inventive concept).

(Step 2B) In the context of the flowchart in MPEP § 2106, subsection III, Step 2B asks: Does the claim recite additional elements that amount to significantly more than the judicial exception? YES. The instant claims apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on the judicial exception of image correction, and therefore integrate the judicial exception into a practical application.
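For orientation only, the processing pipeline recited in claim 1 can be sketched as follows. This is an illustrative reconstruction, not the applicant's disclosed implementation: every helper below is a hypothetical placeholder, and the tone-mapping and saturation math is reduced to trivial stand-ins.

```python
import numpy as np

# Illustrative sketch of the claim 1 pipeline. All helpers are hypothetical
# placeholders; real local/global tone mapping, demosaicing, and saturation
# correction are far more involved than these stand-ins.

def generate_luts(frame, sensor_params):
    """Stand-in for generating the local tone map, global tone map LUT,
    and contrast enhancement LUTs from a frame and sensor parameters."""
    local_tone_map = np.full(frame.shape, sensor_params.get("gain", 1.0))
    global_lut = np.linspace(0, 255, 256)                       # identity global tone map LUT
    contrast_luts = [np.clip(np.linspace(0, 255, 256) * 1.05, 0, 255)]
    return local_tone_map, global_lut, contrast_luts

def blend_and_demosaic(frames):
    """Stand-in: average the frames in place of real blending + demosaicing."""
    return np.mean(frames, axis=0)

def process(frames, sensor_params):
    local_map, global_lut, contrast_luts = generate_luts(frames[0], sensor_params)
    blended = blend_and_demosaic(frames)                 # blended, demosaiced image
    local_mapped = np.clip(blended * local_map, 0, 255)  # apply local tone map
    corrected = np.clip(local_mapped * 0.98, 0, 255)     # saturation adjustment stand-in
    out = global_lut[corrected.astype(np.uint8)]         # apply global tone map LUT
    for lut in contrast_luts:                            # apply contrast enhancement LUTs
        out = lut[out.astype(np.uint8)]
    return out

frames = [np.full((4, 4), 100.0), np.full((4, 4), 110.0)]
output = process(frames, {"gain": 1.0})
```

The one structural point the sketch preserves is the claimed ordering: the LUTs are generated up front from a frame plus sensor parameters, while the image path runs blend/demosaic, then local tone map, then saturation correction, then the global and contrast LUTs.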
In accordance with MPEP § 2106.04(d), Integration of a Judicial Exception Into A Practical Application [R-07.2022]: “Similarly, in a growing body of decisions, the Federal Circuit has distinguished between claims that are ‘directed to’ a judicial exception (which require further analysis to determine their eligibility) and those that are not (which are therefore patent eligible), e.g., claims that improve the functioning of a computer or other technology or technological field. See Diamond v. Diehr, 450 U.S. 175, 209 USPQ 1 (1981); Gottschalk v. Benson, 409 U.S. 63, 175 USPQ 673 (1972). See, e.g., MPEP § 2106.06(b) (summarizing Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 118 USPQ2d 1684 (Fed. Cir. 2016), McRO, Inc. v. Bandai Namco Games Am. Inc., 837 F.3d 1299, 120 USPQ2d 1091 (Fed. Cir. 2016), and other cases that were eligible as improvements to technology or computer functionality instead of being directed to abstract ideas).” “Accordingly, after determining that a claim recites a judicial exception in Step 2A Prong One, examiners should evaluate whether the claim as a whole integrates the recited judicial exception into a practical application of the exception in Step 2A Prong Two. A claim that integrates a judicial exception into a practical application will apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that the claim is more than a drafting effort designed to monopolize the judicial exception.” See MPEP § 2106.05 for discussion of Step 2B. With respect to the claimed limitations, the recited features, while directed to a mathematical process, do integrate the judicial exception into a practical application, as there is a “meaningful limit” amounting to significantly more.
The claimed features, upon invoking the analysis under MPEP § 2106.05(a), are determined to comprise features that do qualify as improvements to the functioning of a computer or to any other technology or technical field, and do qualify as “significantly more”. Dependent claims are also determined to be statutory subject matter as they comprise additional features to the independent claims and refine the scope of the invention overall.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention. (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1, 9 and 17 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Douady-Pleven et al. (US PGPub US 2020/0260001 A1), hereby referred to as “Douady-Pleven”. Consider Claims 1, 9 and 17. Douady-Pleven teaches:

1. A method comprising: / 9. An electronic device comprising: / 17. A non-transitory machine readable medium containing instructions that when executed cause at least one processor of an electronic device to: (Douady-Pleven: abstract, A system accesses an image with each pixel of the image having luminance values each representative of a color component of the pixel.
The system generates a first histogram for aggregate luminance values of the image, and accesses a target histogram for the image representative of a desired global image contrast. The system computes a transfer function based on the first histogram and the target histogram such that when the transfer function is applied, a histogram of the modified aggregate luminance values is within a threshold similarity of the target histogram. The system modifies the image by applying the transfer function to the luminance values of the image to produce a tone mapped image, and outputs the modified image. ) 1. obtaining multiple image frames captured using at least one imaging sensor; / 9. at least one imaging sensor configured to capture multiple image frames; / 17. obtain multiple image frames captured using at least one imaging sensor; (Douady-Pleven: [0029] FIG. 1 is a block diagram illustrating electronic components of a camera 100, according to one embodiment. The camera 100 of the embodiment of FIG. 1 includes one or more processors 102, a system memory 104, a synchronization interface 106, a controller hub 108, one or more microphone controllers 110, an image sensor 112, a lens and focus controller 114, one or more lenses 120, one or more LED lights 122, one or more buttons 124, one or more microphones 126, an I/O port interface 128, a display 130, an expansion pack interface 132, a Bayer scaler 150, a clipping corrector 160, a spatial temporal noise reduction (NR) engine 170, and a global tone mapper 180. [0030] The camera 100 includes one or more processors 102 (such as a microcontroller or a microprocessor) that control the operation and functionality of the camera 100. For instance, the processors 102 can execute computer instructions stored on the memory 104 to perform the functionality described herein.
It should be noted that although LUT generation and color model conversion are described herein as performed by the camera 100, in practice, the camera 100 can capture image data, can provide the image data to an external system (such as a computer, a mobile phone, or another camera), and the external system can generate a LUT based on the captured image data. [0098] FIG. 10 illustrates an example high-level block diagram of the spatial temporal noise reduction (NR) engine 170, according to one embodiment. In some embodiments, the spatial temporal noise reduction (NR) engine 170 is a hardware component that processes frames accessed from the processor 102 or image sensor 112. The spatial temporal noise reduction (NR) engine 170 includes an input frames buffer 1000, a weight map 1005, a temporal noise reduction 1010, a spatial noise reduction unit 1015, and a noise reduction blending unit 1030. The spatial temporal noise reduction (NR) engine 170 can include different and/or additional components than those illustrated in the embodiment of FIG. 10 to perform the functionalities described herein. [0099] The input frames buffer 1000 stores a reference image frame received from the processor 102 or image sensor 112. The input frames buffer 1000 also stores one or more image frames that are temporally adjacent to the reference frame. ) 1. generating a local tone map, a global tone map look-up table (LUT), and one or more contrast enhancement LUTs based on at least one of the image frames and one or more parameters of the at least one imaging sensor, / 9. and at least one processing device configured to generate a local tone map, a global tone map look-up table (LUT), and one or more contrast enhancement LUTs based on at least one of the image frames and one or more parameters of the at least one imaging sensor; / 17. 
generate a local tone map, a global tone map look-up table (LUT), and one or more contrast enhancement LUTs based on at least one of the image frames and one or more parameters of the at least one imaging sensor, (Douady-Pleven: [0099] The input frames buffer 1000 stores a reference image frame received from the processor 102 or image sensor 112. The input frames buffer 1000 also stores one or more image frames that are temporally adjacent to the reference frame. The temporally adjacent frames are frames that were captured prior to (or subsequent to) the capturing of the reference frame. These may have been captured immediately prior to (or subsequent to) or there may have been a short period of time between the captures of these frames. For example, if the reference frame is from a captured video and is frame number X, a temporally adjacent frame may be frame number X+n or X−n, where X and n are integer values. Since the reference image frame and the temporally adjacent image frames are captured within a short period of time of one another, the scenes captured by these frames may include scene objects and captured elements that are shared among the frames. For example, the frames may be from a longer video recording, and so the reference image frame and temporally adjacent image frames may be consecutive image frames of the video capturing a particular scene. [0100] The anti-ghosting unit 1020 determines if a ghosting artifact may exist between a portion of the reference image frame and a corresponding portion of the temporally adjacent image frames stored in the input frame buffer 1000. Both the reference image frame and the temporally adjacent image frames have noise generated from the capturing of the scene. This noise can be caused by both dark noise (noise inherent in the image sensor 112) and Poisson noise (noise due the discrete nature of photons), and may be reduced using noise reduction techniques.
For example, the pixel intensity values (i.e., luminance) of the reference image frame and each of the temporally adjacent image frames may be averaged together to produce an averaged pixel intensity value result that reduces the randomly generated noise that differs from frame to frame. If four frames are averaged, the standard deviation of the noise variations in the frames is divided by two, providing an increase in the signal to noise ratio (SNR) of approximately 6 decibels (dB). [0101]-[0104] [0105] In one embodiment, the pixel distance value computation may be represented by the following equation: ) 1. generating a blended and demosaiced image based on the image frames; / 9. generate a blended and demosaiced image based on the image frames; / 17. generate a blended and demosaiced image based on the image frames; (Douady-Pleven:) 1. generating a local tone mapped image based on the blended and demosaiced image and the local tone map; / 9. generate a local tone mapped image based on the blended and demosaiced image and the local tone map; / 17. generate a local tone mapped image based on the blended and demosaiced image and the local tone map; (Douady-Pleven: [0042] The input buffer 205 receives and/or accesses the image data from the image sensor 112. FIG. 3 illustrates an exemplary representation of the subpixels 310 on the image sensor 112, arranged via a Bayer filter array, according to an embodiment. Each subpixel 310 includes a photosite that determines the intensity of blue, red, or green wavelength light (i.e., photons) incident upon that photosite. The image sensor includes many subpixels 310, typically in the millions. For each red (“R”) and each blue (“B”) subpixel 310, there are two green subpixels 310 (“Gr” and “Gb”). These four subpixels, when combined, may represent a whole pixel and are typically combined in a demosaicing process implemented to determine a color intensity. Referring back to FIG.
2, the input buffer 205 receives all or a portion of the intensity values of these subpixels 310 from the image sensor 112. ) 1. adjusting color saturation based on the local tone mapped image to generate a corrected image; / 9. adjust color saturation based on the local tone mapped image to generate a corrected image; / 17. adjust color saturation based on the local tone mapped image to generate a corrected image; (Douady-Pleven: [0086]-[0091], [0087] The color intensity values of some pixels of the image may already be at the maximum saturation value. Each image has a limited dynamic range, and the maximum saturation value is the maximum value in the dynamic range of an image. For example, if each color channel of an image is 8-bits, the maximum saturation value is 255. When an image is captured (or previously adjusted), some color values at some portions of the image may exceed the maximum saturation value. In this case, these (highlight) portions of the image are displayed as white in color. However, the actual color of the scene captured by the image at that portion of the image may not have been white, but some other color, such as a light blue. The change in color in the captured image is due to the loss of information due to the limitations of the dynamic range changing the ratio of the color intensities at that pixel. While the color values are at a high intensity, this error may be less noticeable since the pixel appears to be near-white. However, if that portion of the image is adjusted with an adjustment value having a multiplicative factor less than one, the color values are decreased, and the incorrect ratio between the red, green, and blue colors is more noticeable, as the resulting pixel is no longer white and may have an unusual or unnatural color unrepresentative of the original color (e.g., light blue or grey if all channels are desaturated identically).
[0088] To adjust for this issue, the highlight adjustment corrector 720 performs the desaturation operation by determining whether the adjustment value made to a color intensity value of a pixel, or to all three color intensity values of the pixel, is greater than a corrected adjustment value equal to two times the input color value subtracted by the maximum saturation value. In other words, the highlight adjustment corrector 720 determines: out=max(adjust*in, 2(in)−max_saturation) (8) [0091] FIG. 8 illustrates exemplary color curves applied to an image, according to an embodiment. The saturation adjustment curve 850 illustrates the result of adjusting the color levels of an image to be larger. Some values will reach a saturation point 820, and thus be limited by the maximum dynamic range of the image. However, other values are instead set to a value that is lower than the maximum saturation value. In the illustrated exemplary color curve 850, the highlight adjustment corrector 720 generates a curve that tapers off exponentially such that although high intensity values are near the saturation point, the high intensity values gradually reach the maximum saturation point. ) 1. and generating an output image based on the corrected image, the global tone map LUT, and the one or more contrast enhancement LUTs. / 9. and generate an output image based on the corrected image, the global tone map LUT, and the one or more contrast enhancement LUTs. / 17. and generate an output image based on the corrected image, the global tone map LUT, and the one or more contrast enhancement LUTs. (Douady-Pleven: [0131]-[0138], Figure 12, [0131] FIG. 12 illustrates an example high-level block diagram of the global tone mapper 180, according to one embodiment. In some embodiments, the global tone mapper 180 is a hardware component that processes images accessed from the processor 102 or image sensor 112.
The global tone mapper 180 includes a histogram module 1205, an automatic white balance module 1210, an auto-exposure correction module 1215, a target histogram determination module 1220, a histogram transfer module 1225, and an unsharp mask module 1230. The global tone mapper 180 can include different and/or additional components than those illustrated in the embodiment of FIG. 12 to perform the functionalities described herein. [0136] The auto-exposure correction module 1215 corrects the image for any undesired auto-exposure. The image may have been over- or under-exposed during capture by the camera 100. In particular, the auto-exposure mechanism of the camera 100 may have underexposed the region of interest of the image in order to preserve the highlights in the image such that the highlight areas stay with the dynamic range of the camera 100. [0137] In one embodiment, to compensate for these exposure errors, the auto-exposure correction module 1215 uses a look up table (LUT) that provides corrected output luminance values for input luminance values without clipping highlight information. The LUT is generated based on a tone curve generated by concatenating a linear curve and a Bezier curve. The slope of the linear curve is based on the amplification to apply to the image to better expose the relevant regions of interest (ROI). This slope may be equal to an exposure bias of 2 ev (2 stops). The Bezier curve portion is based on the following equation: ) Claim Rejections - 35 USC § 103 The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent may not be obtained though the invention is not identically disclosed or described as set forth in section 102 of this title, if the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art to which said subject matter pertains. Patentability shall not be negatived by the manner in which the invention was made. Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Dey et al. (WIPO Publication WO 2022/181995 A1), hereby referred to as “Dey”, in view of Douady-Pleven et al. (US PGPub US 2020/0260001 A1), hereby referred to as “Douady-Pleven”. Dey was cited by applicant in IDS submitted on June 25, 2024. Consider Claims 1, 9 and 17. Dey teaches: 1. A method comprising: / 9. An electronic device comprising: / 17. A non-transitory machine readable medium containing instructions that when executed cause at least one processor of an electronic device to: (Dey: abstract, In an embodiment, a method, for capturing a high resolution High Dynamic Range (HDR) image is disclosed. The method includes fetching a plurality of facial co-ordinates generated by an Image Signaling Processor (ISP) corresponding to one or more binning frames during an image preview phase. The method includes mapping the plurality of facial co-ordinates into a re-mosaic frame corresponding to a high resolution HDR image during a HDR capture phase. The method includes transmitting the re-mosaic frames comprising the mapped plurality of facial co-ordinates to a HDR library. The method includes processing one or more facial regions in the high resolution HDR image for enhancing a quality associated with the high resolution HDR image.
[31] Continuing with the above embodiment, the system 102 may include a processor 202, a memory 204, data 206, module(s) 208, resource (s) 210, a display unit 212, an imaging sensor 214, an Image Signaling processor (ISP) 216, a HDR library 218, and a LTM library 220. In an embodiment, the processor 202, the memory 204, the data 206, the module(s) 208, the resource (s) 210, the display unit 212, the imaging sensor 214, the ISP 216, the HDR library 218, and the LTM library 220 may be communicably coupled with one another. ) 1. obtaining multiple image frames captured using at least one imaging sensor; / 9. at least one imaging sensor configured to capture multiple image frames; / 17. obtain multiple image frames captured using at least one imaging sensor; (Dey: [8] In accordance with some example embodiments of the present subject matter, a method, for capturing a high resolution High Dynamic Range (HDR) image is disclosed. The method includes fetching a plurality of facial co-ordinates generated by an Image Signaling Processor (ISP) corresponding to one or more binning frames during an image preview phase. The method includes mapping the plurality of facial co-ordinates into a re-mosaic frame corresponding to a high resolution HDR image during a HDR capture phase. The method includes transmitting the re-mosaic frames comprising the mapped plurality of facial co-ordinates to a HDR library. The method includes processing one or more facial regions in the high resolution HDR image for enhancing a quality associated with the high resolution HDR image. [9] In accordance with some example embodiments of the present subject matter, a system, for capturing a high resolution High Dynamic Range (HDR) image is disclosed. The system includes fetching a plurality of facial co-ordinates generated by an Image Signaling Processor (ISP) corresponding to one or more binning frames during an image preview phase.
The system includes mapping the plurality of facial co-ordinates into a re-mosaic frame corresponding to a high resolution HDR image during a HDR capture phase. The system includes transmitting the re-mosaic frames comprising the mapped plurality of facial co-ordinates to a HDR library. ) 1. generating a local tone map based on at least one of the image frames and one or more parameters of the at least one imaging sensor; / 9. and at least one processing device configured to generate a local tone map based on at least one of the image frames and one or more parameters of the at least one imaging sensor; / 17. generate a local tone map based on at least one of the image frames and one or more parameters of the at least one imaging sensor; (Dey: [47] In response to receiving the one or more re-mosaic frames including the mapped number of facial co-ordinates, the HDR library 218 may be configured to process one or more facial regions in the high resolution HDR image. In an embodiment, processing may be performed for enhancing a quality associated with the high resolution HDR image. [48] Continuing with the above embodiment, in response to processing the one or more facial regions in the high resolution HDR image, the HDR library 218 may be configured to transmit the high resolution HDR image including the mapped number of facial co-ordinates. In an embodiment, the high resolution HDR image may be transmitted to the Local Tone Mapping library (LTM) library 220. ) 1. generating a blended and demosaiced image based on the image frames; / 9. generate a blended and demosaiced image based on the image frames; / 17. generate a blended and demosaiced image based on the image frames; (Dey: [46] Continuing with the above embodiment, upon mapping the one or more re-mosaic frames with the number facial co-ordinates, the ISP 216 may be configured to transmit the one or more re-mosaic frames including the mapped number of facial co-ordinates to the HDR library 218.
[47] In response to receiving the one or more re-mosaic frames including the mapped number of facial co-ordinates, the HDR library 218 may be configured to process one or more facial regions in the high resolution HDR image. In an embodiment, processing may be performed for enhancing a quality associated with the high resolution HDR image. [48] Continuing with the above embodiment, in response to processing the one or more facial regions in the high resolution HDR image, the HDR library 218 may be configured to transmit the high resolution HDR image including the mapped number of facial co-ordinates. In an embodiment, the high resolution HDR image may be transmitted to the Local Tone Mapping library (LTM) library 220. [56] ) 1. generating a local tone mapped image based on the blended and demosaiced image and the local tone map; / 9. generate a local tone mapped image based on the blended and demosaiced image and the local tone map; / 17. generate a local tone mapped image based on the blended and demosaiced image and the local tone map; (Dey: [58] Continuing with the above embodiment, the process may include transmitting (step 312) the one or more re-mosaic frames including the mapped number of facial coordinates to the HDR library 218 as referred in the fig. 2. [59] In response to receiving the one or more re-mosaic frames including the mapped number of facial co-ordinates, the process may proceed towards processing (step 314) one or more facial regions in the high resolution HDR image. In an embodiment, the one or more facial regions may be associated with the at least one face in the one or more binning review frames. In an embodiment, processing may be performed for enhancing a quality associated with the high resolution HDR.
[60] Continuing with the above embodiment, in response to processing the one or more facial regions in the high resolution HDR image, the process may include transmitting (step 316) the high resolution HDR image including the mapped number of facial coordinates. In an embodiment, the high resolution HDR image may be transmitted to the Local Tone Mapping library (LTM) library 222 as referred in the fig. 2. ) Even if Dey does not teach: "generating a global tone map look-up table (LUT), and one or more contrast enhancement LUTs based on at least one of the plurality of image frames and one or more parameters of the at least one imaging sensor"; "adjusting color saturation based on the local tone mapped image to generate a corrected image"; and "generating an output image based on the corrected image, the global tone map LUT, and the one or more contrast enhancement LUTs", Douady-Pleven teaches: 1. A method comprising: / 9. An electronic device comprising: / 17. A non-transitory machine readable medium containing instructions that when executed cause at least one processor of an electronic device to: (Douady-Pleven: abstract, A system accesses an image with each pixel of the image having luminance values each representative of a color component of the pixel. The system generates a first histogram for aggregate luminance values of the image, and accesses a target histogram for the image representative of a desired global image contrast. The system computes a transfer function based on the first histogram and the target histogram such that when the transfer function is applied, a histogram of the modified aggregate luminance values is within a threshold similarity of the target histogram. The system modifies the image by applying the transfer function to the luminance values of the image to produce a tone mapped image, and outputs the modified image. ) 1.
obtaining multiple image frames captured using at least one imaging sensor; / 9. at least one imaging sensor configured to capture multiple image frames; / 17. obtain multiple image frames captured using at least one imaging sensor; (Douady-Pleven: [0029] FIG. 1 is a block diagram illustrating electronic components of a camera 100, according to one embodiment. The camera 100 of the embodiment of FIG. 1 includes one or more processors 102, a system memory 104, a synchronization interface 106, a controller hub 108, one or more microphone controllers 110, an image sensor 112, a lens and focus controller 114, one or more lenses 120, one or more LED lights 122, one or more buttons 124, one or more microphones 126, an I/O port interface 128, a display 130, an expansion pack interface 132, a Bayer scaler 150, a clipping corrector 160, a spatial temporal noise reduction (NR) engine 170, and a global tone mapper 180. [0030] The camera 100 includes one or more processors 102 (such as a microcontroller or a microprocessor) that control the operation and functionality of the camera 100. For instance, the processors 102 can execute computer instructions stored on the memory 104 to perform the functionality described herein. It should be noted that although LUT generation and color model conversion are described herein as performed by the camera 100, in practice, the camera 100 can capture image data, can provide the image data to an external system (such as a computer, a mobile phone, or another camera), and the external system can generate a LUT based on the captured image data. [0098] FIG. 10 illustrates an example high-level block diagram of the spatial temporal noise reduction (NR) engine 170, according to one embodiment. In some embodiments, the spatial temporal noise reduction (NR) engine 170 is a hardware component that processes frames accessed from the processor 102 or image sensor 112.
The spatial temporal noise reduction (NR) engine 170 includes an input frames buffer 1000, a weight map 1005, a temporal noise reduction 1010, a spatial noise reduction unit 1015, and a noise reduction blending unit 1030. The spatial temporal noise reduction (NR) engine 170 can include different and/or additional components than those illustrated in the embodiment of FIG. 10 to perform the functionalities described herein. [0099] The input frames buffer 1000 stores a reference image frame received from the processor 102 or image sensor 112. The input frames buffer 1000 also stores one or more image frames that are temporally adjacent to the reference frame. ) 1. generating a local tone map, a global tone map look-up table (LUT), and one or more contrast enhancement LUTs based on at least one of the image frames and one or more parameters of the at least one imaging sensor, / 9. and at least one processing device configured to generate a local tone map, a global tone map look-up table (LUT), and one or more contrast enhancement LUTs based on at least one of the image frames and one or more parameters of the at least one imaging sensor; / 17. generate a local tone map, a global tone map look-up table (LUT), and one or more contrast enhancement LUTs based on at least one of the image frames and one or more parameters of the at least one imaging sensor, (Douady-Pleven: [0099] The input frames buffer 1000 stores a reference image frame received from the processor 102 or image sensor 112. The input frames buffer 1000 also stores one or more image frames that are temporally adjacent to the reference frame. The temporally adjacent frames are frames that were captured prior to (or subsequent to) the capturing of the reference frame. These may have been captured immediately prior to (or subsequent to) or there may have been a short period of time between the captures of these frames.
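As a rough illustration of why the buffered reference and temporally adjacent frames are useful (this is not Douady-Pleven's actual implementation, which also applies weight maps and anti-ghosting), averaging N frames reduces the standard deviation of zero-mean frame-to-frame noise by a factor of √N; for four frames that is a factor of two, roughly the 6 dB SNR gain cited later in this passage:

```python
import random
import statistics

def average_frames(frames):
    """Average pixel intensities across temporally adjacent frames.

    frames: list of equal-length lists of pixel intensities.
    Random noise that differs from frame to frame partially cancels;
    for N frames the noise standard deviation drops by roughly sqrt(N).
    """
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

# Demonstration: a flat grey scene (true value 128) captured four times
# with additive zero-mean Gaussian noise of standard deviation 8.
random.seed(0)
true_value = 128.0
frames = [[true_value + random.gauss(0, 8) for _ in range(10_000)]
          for _ in range(4)]

noisy_std = statistics.pstdev(frames[0])   # close to 8
averaged = average_frames(frames)
avg_std = statistics.pstdev(averaged)      # close to 4, i.e. halved
```

The halved standard deviation corresponds to a roughly 6 dB improvement in SNR (20·log10(2) ≈ 6.02 dB), matching the figure quoted from paragraph [0100].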
For example, if the reference frame is from a captured video and is frame number X, a temporally adjacent frame may be frame number X+n or X−n, where X and n are integer values. Since the reference image frame and the temporally adjacent image frames are captured within a short period of time of one another, the scenes captured by these frames may include scene objects and captured elements that are shared among the frames. For example, the frames may be from a longer video recording, and so the reference image frame and temporally adjacent image frames may be consecutive image frames of the video capturing a particular scene. [0100] The anti-ghosting unit 1020 determines if a ghosting artifact may exist between a portion of the reference image frame and a corresponding portion of the temporally adjacent image frames stored in the input frame buffer 1000. Both the reference image frame and the temporally adjacent image frames have noise generated from the capturing of the scene. This noise can be caused by both dark noise (noise inherent in the image sensor 112) and Poisson noise (noise due to the discrete nature of photons), and may be reduced using noise reduction techniques. For example, the pixel intensity values (i.e., luminance) of the reference image frame and each of the temporally adjacent image frames may be averaged together to produce an averaged pixel intensity value result that reduces the randomly generated noise that differs from frame to frame. If four frames are averaged, the standard deviation of the noise variations in the frames is divided by two, providing an increase in the signal to noise ratio (SNR) of approximately 6 decibels (dB). [0101]-[0104] [0105] In one embodiment, the pixel distance value computation may be represented by the following equation: ) 1. generating a blended and demosaiced image based on the image frames; / 9. generate a blended and demosaiced image based on the image frames; / 17.
generate a blended and demosaiced image based on the image frames; (Douady-Pleven:) 1. generating a local tone mapped image based on the blended and demosaiced image and the local tone map; / 9. generate a local tone mapped image based on the blended and demosaiced image and the local tone map; / 17. generate a local tone mapped image based on the blended and demosaiced image and the local tone map; (Douady-Pleven: [0042] The input buffer 205 receives and/or accesses the image data from the image sensor 112. FIG. 3 illustrates an exemplary representation of the subpixels 310 on the image sensor 112, arranged via a Bayer filter array, according to an embodiment. Each subpixel 310 includes a photosite that determines the intensity of blue, red, or green wavelength light (i.e., photons) incident upon that photosite. The image sensor includes many subpixels 310, typically in the millions. For each red (“R”) and each blue (“B”) subpixel 310, there are two green subpixels 310 (“Gr” and “Gb”). These four subpixels, when combined, may represent a whole pixel and are typically combined in a demosaicing process implemented to determine a color intensity. Referring back to FIG. 2, the input buffer 205 receives all or a portion of the intensity values of these subpixels 310 from the image sensor 112. ) 1. adjusting color saturation based on the local tone mapped image to generate a corrected image; / 9. adjust color saturation based on the local tone mapped image to generate a corrected image; / 17. adjust color saturation based on the local tone mapped image to generate a corrected image; (Douady-Pleven: [0086]-[0091], [0087] The color intensity values of some pixels of the image may already be at the maximum saturation value. Each image has a limited dynamic range, and the maximum saturation value is the maximum value in the dynamic range of an image. For example, if each color channel of an image is 8-bits, the maximum saturation value is 255.
When an image is captured (or previously adjusted), some color values at some portions of the image may exceed the maximum saturation value. In this case, these (highlight) portions of the image are displayed as white in color. However, the actual color of the scene captured by the image at that portion of the image may not have been white, but some other color, such as a light blue. The change in color in the captured image is due to the loss of information due to the limitations of the dynamic range changing the ratio of the color intensities at that pixel. While the color values are at a high intensity, this error may be less noticeable since the pixel appears to be near-white. However, if that portion of the image is adjusted with an adjustment value having a multiplicative factor less than one, the color values are decreased, and the incorrect ratio between the red, green, and blue colors is more noticeable, as the resulting pixel is no longer white and may have an unusual or unnatural color unrepresentative of the original color (e.g., light blue or grey if all channels are desaturated identically). [0088] To adjust for this issue, the highlight adjustment corrector 720 performs the desaturation operation by determining whether the adjustment value made to a color intensity value of a pixel, or to all three color intensity values of the pixel, is greater than a corrected adjustment value equal to two times the input color value subtracted by the maximum saturation value. In other words, the highlight adjustment corrector 720 determines: out=max(adjust*in, 2(in)−max_saturation) (8) [0091] FIG. 8 illustrates exemplary color curves applied to an image, according to an embodiment. The saturation adjustment curve 850 illustrates the result of adjusting the color levels of an image to be larger. Some values will reach a saturation point 820, and thus be limited by the maximum dynamic range of the image.
However, other values are instead set to a value that is lower than the maximum saturation value. In the illustrated exemplary color curve 850, the highlight adjustment corrector 720 generates a curve that tapers off exponentially such that although high intensity values are near the saturation point, the high intensity values gradually reach the maximum saturation point. ) 1. and generating an output image based on the corrected image, the global tone map LUT, and the one or more contrast enhancement LUTs. / 9. and generate an output image based on the corrected image, the global tone map LUT, and the one or more contrast enhancement LUTs. / 17. and generate an output image based on the corrected image, the global tone map LUT, and the one or more contrast enhancement LUTs. (Douady-Pleven: [0131]-[0138], Figure 12, [0131] FIG. 12 illustrates an example high-level block diagram of the global tone mapper 180, according to one embodiment. In some embodiments, the global tone mapper 180 is a hardware component that processes images accessed from the processor 102 or image sensor 112. The global tone mapper 180 includes a histogram module 1205, an automatic white balance module 1210, an auto-exposure correction module 1215, a target histogram determination module 1220, a histogram transfer module 1225, and an unsharp mask module 1230. The global tone mapper 180 can include different and/or additional components than those illustrated in the embodiment of FIG. 12 to perform the functionalities described herein. [0136] The auto-exposure correction module 1215 corrects the image for any undesired auto-exposure. The image may have been over- or under-exposed during capture by the camera 100. In particular, the auto-exposure mechanism of the camera 100 may have underexposed the region of interest of the image in order to preserve the highlights in the image such that the highlight areas stay within the dynamic range of the camera 100.
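The desaturation operation of equation (8) quoted above, out = max(adjust*in, 2(in) − max_saturation), can be sketched directly. A minimal stand-alone version, assuming 8-bit channels (the function name is illustrative, not from either reference):

```python
def correct_highlight(in_value, adjust, max_saturation=255):
    """Per-channel highlight-clipping correction from the cited
    equation (8): out = max(adjust * in, 2 * in - max_saturation).

    For mid-range values the ordinary adjustment wins; for values
    close to max_saturation the corrected term dominates, so a
    clipped (white) pixel stays near white instead of shifting to
    an unnatural hue when a gain below one is applied.
    """
    return max(adjust * in_value, 2 * in_value - max_saturation)

# A nearly clipped channel (250 of 255) with a 0.5x adjustment stays
# near white (245 rather than 125)...
print(correct_highlight(250, 0.5))
# ...while a mid-tone channel (100) is adjusted normally (to 50).
print(correct_highlight(100, 0.5))
```

This reproduces the behavior the citation describes: the correction only takes over where the straightforward adjustment would expose the distorted color ratios of saturated highlights.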
[0137] In one embodiment, to compensate for these exposure errors, the auto-exposure correction module 1215 uses a look up table (LUT) that provides corrected output luminance values for input luminance values without clipping highlight information. The LUT is generated based on a tone curve generated by concatenating a linear curve and a Bezier curve. The slope of the linear curve is based on the amplification to apply to the image to better expose the relevant regions of interest (ROI). This slope may be equal to an exposure bias of 2 ev (2 stops). The Bezier curve portion is based on the following equation: ) It would have been obvious before the effective filing date of the claimed invention to one of ordinary skill in the art to modify Dey’s method and system for high resolution HDR imaging analysis with the global tone mapping algorithm of Douady-Pleven, as they are both directed towards methods for image analysis and processing. The determination of obviousness is predicated upon the following findings: One skilled in the art would have been motivated to modify the HDR imaging of Dey with the algorithm of Douady-Pleven for ensuring the incorporation of “a color filter array scaler, temporal and spatial video noise reduction, the prevention of highlight clipping in video, and video global tone mapping” for accuracy and image enhancement (Douady-Pleven: [0002]).
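The auto-exposure correction LUT quoted in [0137] concatenates a linear segment (slope set by an exposure bias, e.g. 2 ev = 4x gain) with a Bezier taper whose expression is cut off in the excerpt. As a rough stand-in only, the sketch below uses an exponential soft knee with the same qualitative properties (initial slope equal to the gain, monotonic, never exceeding the maximum value); the function name and curve choice are illustrative assumptions, not from either reference:

```python
import math

def build_exposure_lut(gain=4.0, max_value=255):
    """Build a LUT mapping input luminance to corrected output
    luminance without clipping highlights.

    Shadows are amplified by roughly `gain` (4x = +2 ev), and the
    curve tapers smoothly toward `max_value` instead of clipping,
    standing in for the linear-plus-Bezier construction described
    in the citation.
    """
    return [max_value * (1.0 - math.exp(-gain * x / max_value))
            for x in range(max_value + 1)]

lut = build_exposure_lut()
# lut[10] is roughly 37: a dark input is lifted by close to the 4x
# gain, while lut[255] stays just under 255 rather than clipping.
```

The LUT form matters because, as the citation notes, the correction is applied per input luminance value at lookup time rather than recomputed per pixel.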
Furthermore, the prior art collectively includes each element claimed (though not all in the same reference), and one of ordinary skill in the art could have combined the elements in the manner explained above using known engineering design, interface and/or programming techniques, without changing a “fundamental” operating principle of Dey, while the teaching of Douady-Pleven continues to perform the same function as originally taught prior to being combined, in order to produce the repeatable and predictable result of leveraging known image processing and enhancement techniques for HDR images. It is for at least the aforementioned reasons that the examiner has reached a conclusion of obviousness with respect to the claim in question. Consider Claims 2 and 10. The combination of Dey and Douady-Pleven teaches: 2. The method of Claim 1, wherein generating the local tone map comprises: selecting one of the image frames; generating a gain map based on the selected image frame; dividing the gain map into multiple tiles; and generating a three-dimensional (3D) LUT using the tiles. / 10. The electronic device of Claim 9, wherein, to generate the local tone map, the at least one processing device is configured to: select one of the image frames; generate a gain map based on the selected image frame; divide the gain map into multiple tiles; and generate a three-dimensional (3D) LUT using the tiles. (Douady-Pleven: [0148]-[0149] After computing the transfer function, in one embodiment, the histogram transfer module 1225 transposes the transfer function to a color space where a gamma curve of the image has not yet been applied. Given the transfer function T(x), in one embodiment the transposed transfer function T′(x)=TC−1(T(TC(x))), where TC is the gamma curve. The histogram transfer module 1225 applies a gain based on the transposed transfer function to the color values of the image.
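The transfer-function computation described in this citation is, at its core, histogram matching: build cumulative histograms for the source image and the target, then map each luminance level to the target level with the nearest cumulative frequency. A minimal grayscale sketch under that reading (helper names are illustrative; the gamma transposition and temporal filtering described in the citation are omitted):

```python
def cumulative(hist):
    """Running sum of a histogram (an unnormalized CDF)."""
    total, out = 0, []
    for count in hist:
        total += count
        out.append(total)
    return out

def histogram_transfer(source_hist, target_hist):
    """Map each source luminance level to the first target level whose
    cumulative frequency reaches it, so that applying the map makes
    the source histogram approximate the target histogram."""
    src_cdf = cumulative(source_hist)
    tgt_cdf = cumulative(target_hist)
    # Normalize so the two CDFs are directly comparable.
    src_cdf = [c / src_cdf[-1] for c in src_cdf]
    tgt_cdf = [c / tgt_cdf[-1] for c in tgt_cdf]
    transfer, j = [], 0
    for c in src_cdf:
        while j < len(tgt_cdf) - 1 and tgt_cdf[j] < c:
            j += 1
        transfer.append(j)
    return transfer

# An underexposed 4-level image (mass piled at the dark levels)
# matched toward a uniform target histogram: dark levels are pushed up.
source = [60, 30, 8, 2]      # pixel counts per luminance level 0..3
target = [25, 25, 25, 25]
T = histogram_transfer(source, target)
```

Per the citation, the resulting transfer function is then applied as a per-pixel gain, x′ = x·g(Y) with g(x) = T′(x)/x, so a pixel's R, G, and B channels are scaled together according to its luminance rather than remapped independently.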
This new gain is defined as gfinal(x)=g(Taec(x))*gaec(x), where Taec is the auto-exposure adjustment curve and gaec(x)*x=Taec(x). In one embodiment, the auto-exposure adjustment is not performed, and so the histogram transfer module does not transpose the transfer function and applies it to the image as x′=x*g(Y(R, G, B)), where x is one of R, G or B channels, Y is the luminance function of the given pixel, and g(x)=T′(x)/x. The histogram transfer module 1225 may convert the image with the gain applied back to the sRGB color space by applying the gamma curve to it. In one embodiment, the histogram transfer module 1225 further modifies (e.g., by multiplication) the gain of the transposed transfer function based on the transfer function computed for the prior frame (i.e., a temporal filter is applied). The histogram transfer module 1225 may also apply a spatial 5×5 filter on the gain map. Prior to applying the spatial 5×5 filter, the gain g(Y(x)) may be filtered to enhance small details. ) Consider Claims 3, 11 and 18. The combination of Dey and Douady-Pleven teaches: 3. The method of Claim 2, wherein generating the gain map comprises: down-sampling the selected image frame to generate a down-sampled image frame; converting the down-sampled image frame into a single luma channel image; applying local tone mapping to the single luma channel image to generate a tone mapped image; and determining gain values of the gain map based on ratios of values in the tone mapped image and values in the single luma channel image. / 11.
The electronic device of Claim 10, wherein, to generate the gain map, the at least one processing device is configured to: down-sample the selected image frame to generate a down-sampled image frame; convert the down-sampled image frame into a single luma channel image; apply local tone mapping to the single luma channel image to generate a tone mapped image; and determine gain values of the gain map based on ratios of values in the tone mapped image and values in the single luma channel image. / 18. The non-transitory machine readable medium of Claim 17, wherein: the instructions that when executed cause the at least one processor to generate the local tone map comprise instructions that when executed cause the at least one processor to: select one of the image frames; generate a gain map based on the selected image frame; divide the gain map into multiple tiles; and generate a three-dimensional (3D) LUT using the tiles; the instructions that when executed cause the at least one processor to generate the gain map comprise instructions that when executed cause the at least one processor to: down-sample the selected image frame to generate a down-sampled image frame; convert the down-sampled image frame into a single luma channel image; apply local tone mapping to the single luma channel image to generate a tone mapped image; and determine gain values of the gain map based on ratios of values in the tone mapped image and values in the single luma channel image; and the instructions that when executed cause the at least one processor to generate the one or more contrast enhancement LUTs comprise instructions that when executed cause the at least one processor to generate the one or more contrast enhancement LUTs based on the tone mapped image. (Douady-Pleven: [0094] Initially, the clipping corrector 160 accesses 905 the image from image sensor.
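The gain-map generation recited in claims 3, 11, and 18 can be sketched as a short pipeline. All helper names below are illustrative, and a square-root curve stands in for the application's actual local tone mapping operator, which is not reproduced in the excerpt:

```python
def downsample(image, factor=2):
    """Keep every `factor`-th pixel in each dimension (illustrative)."""
    return [row[::factor] for row in image[::factor]]

def to_luma(image_rgb):
    """Convert RGB pixels to a single luma channel (Rec. 601 weights)."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in image_rgb]

def local_tone_map(luma, max_value=255.0):
    """Stand-in tone operator: a square-root curve that lifts shadows."""
    return [[max_value * (y / max_value) ** 0.5 for y in row]
            for row in luma]

def gain_map(image_rgb):
    """Gain values are the ratios of tone-mapped values to the single
    luma channel values, per the claim language: down-sample, convert
    to luma, tone map, then divide element-wise."""
    small = downsample(image_rgb)
    luma = to_luma(small)
    mapped = local_tone_map(luma)
    return [[m / y if y else 1.0 for m, y in zip(mr, yr)]
            for mr, yr in zip(mapped, luma)]

# A uniform dark-grey 4x4 image yields a 2x2 gain map with every
# gain above 1 (shadows are lifted by the tone operator).
img = [[(64, 64, 64)] * 4 for _ in range(4)]
gains = gain_map(img)
```

Claim 2 then recites dividing such a gain map into multiple tiles and generating a 3D LUT from them; that tiling step is omitted from this sketch.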
The clipping corrector 160 determines 901, for each color channel of each portion of the image, a corresponding adjustment to apply to the color channel to correct for a color irregularity. These may be color irregularities such as white balance issues and lens shading irregularities as described above. [0095] The clipping corrector 160 determines 915 a corrected adjustment value based on a difference between twice the pixel value and the maximum saturation value. The maximum saturation value is the maximum value of the dynamic range supported by an image. For an image with an 8-bit color channel, this is 255 for each color channel. [0096] The clipping corrector 160 determines 920 whether the result of the adjustment is larger than the corrected adjustment value. If so, the clipping corrector 160 applies 925 the adjustment to the corresponding color channel of the image portion to produce the adjusted color channel. In other words, the adjustment is applied to the intensity value of the pixel di

Prosecution Timeline

Oct 12, 2023: Application Filed
Feb 27, 2026: Non-Final Rejection, §102/§103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12586249
PROCESSING APPARATUS, PROCESSING METHOD, AND STORAGE MEDIUM FOR CALIBRATING AN IMAGE CAPTURE APPARATUS
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12586354
TRAINING METHOD, APPARATUS AND NON-TRANSITORY COMPUTER READABLE MEDIUM FOR A MACHINE LEARNING MODEL
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12573083
COMPUTER-READABLE RECORDING MEDIUM STORING OBJECT DETECTION PROGRAM, DEVICE, AND MACHINE LEARNING MODEL GENERATION METHOD OF TRAINING OBJECT DETECTION MODEL TO DETECT CATEGORY AND POSITION OF OBJECT
Granted Mar 10, 2026 (2y 5m to grant)
Patent 12548297
IMAGE PROCESSING METHOD AND APPARATUS, COMPUTER DEVICE, STORAGE MEDIUM, AND PROGRAM PRODUCT BASED ON FEATURE AND DISTRIBUTION CORRELATION
Granted Feb 10, 2026 (2y 5m to grant)
Patent 12524504
METHOD AND DATA PROCESSING SYSTEM FOR PROVIDING EXPLANATORY RADIOMICS-RELATED INFORMATION
Granted Jan 13, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 86%
With Interview (+17.9%): 99%
Median Time to Grant: 2y 8m
PTA Risk: Low
Based on 868 resolved cases by this examiner. Grant probability derived from career allow rate.
