DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55 (European Application 23218634.6 filed December 20, 2023).
Information Disclosure Statement
The information disclosure statement (IDS) submitted on November 22, 2024 was filed before the mailing date of the First Action on the Merits (this Office Action). The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the Examiner.
Specification
The listing of references in the specification is not a proper information disclosure statement. 37 CFR 1.98(b) requires a list of all patents, publications, or other information submitted for consideration by the Office, and MPEP § 609.04(a) states, "the list may not be incorporated into the specification but must be submitted in a separate paper." Therefore, unless the references have been cited by the examiner on form PTO-892, they have not been considered.
The disclosure is objected to because of the following informalities:
In Paragraph 29, line 2, reference character “110” should read --S110-- to be consistent with Figure 1, which is being described.
In Paragraph 53, reference character 226 is missing from the list of reference characters and descriptions / labels given for the elements of Figure 2.
Appropriate correction is required.
Claim Interpretation – Functional Analysis
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that use the word “means” or “step” or a generic placeholder but are nonetheless not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph because the claim limitation(s) recite(s) sufficient structure, materials, or acts to entirely perform the recited function.
Such claim limitation(s) is/are: “video processing system having a processing capability” in claim 8; “video processing system configured for updating a buffer” in claim 9; and “circuitry configured to …” in claim 9.
First, the Examiner notes that one of ordinary skill in the art would readily understand the claimed “circuitry” as connoting sufficient structure, such that the limitation(s) do not invoke functional analysis under 35 U.S.C. 112(f).
Second, in view of Specification Paragraph 49, the “video processing system” positively comprises at least “circuitry,” which is described as processors / memories in Specification Paragraphs 50 – 52 and thus contains the claimed “buffers” and the processors / structures that perform the claimed instructions / functions. Because the “video processing system” comprises only elements that connote sufficient structure, the claimed “video processing system” itself connotes sufficient structure.
Because this/these claim limitation(s) is/are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are not being interpreted to cover only the corresponding structure, material, or acts described in the specification as performing the claimed function, and equivalents thereof.
If applicant intends to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to remove the structure, materials, or acts that performs the claimed function; or (2) present a sufficient showing that the claim limitation(s) does/do not recite sufficient structure, materials, or acts to perform the claimed function.
Claim Objections
Claims 9 – 15 are objected to because of the following informalities:
Regarding claim 9, the ordinal labels “first”, “second”, “third”, and the like are used throughout the claim, but not necessarily in the order of the steps of the algorithm (e.g., the “second obtaining function” is missing). Such labels / numbering are generally not afforded patentable weight as an additional limitation, as the feature raises indefinite metes-and-bounds issues. For purposes of examination, the numbering / ordinal labels are afforded no patentable weight beyond distinguishing between steps (MPEP § 2111.03, subsection I).
Regarding claims 10 – 15, the dependent claims do not cure the deficiencies of independent claim 9 from which they depend and thus are similarly objected to.
Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1 – 15 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Regarding claim 1, the claim appears to be a method claim, but recites the preamble “in a video processing system for updating a buffer,” rendering the metes and bounds of the claim indefinite as to (a) the patentable weight to afford the preamble, which usually is not afforded patentable weight, and (b) the statutory category of the claim, i.e., whether it is a method claim or a functionally claimed apparatus.
Regarding claims 2 – 7, the dependent claims do not cure the deficiencies of independent claim 1 from which they depend and thus are similarly rejected.
Regarding claim 5, the claim requires “obtaining a measure of amount of noise,” which has indefinite metes and bounds as to how the “obtaining” is to be done, since no prior computation is recited from which the measure could be “obtained.” Compare the more generally accepted “determining a measure …” limitation in claim 1, which is generally afforded definite metes and bounds.
Regarding claim 9, the claim recites “using a filtering algorithm” as only an intended use, with no description of the type of filtering to be done or the class of algorithms to be used; the claim therefore has indefinite metes and bounds.
Regarding claim 9, the claim recites “to reduce temporal noise” (multiple times), which is an intended use; the claim therefore has indefinite metes and bounds regarding the patentable weight to afford the limitation.
Regarding claim 1, see claim 9, which is the apparatus performing the steps of the claimed method; claim 1 is similarly rejected.
Regarding claim 8, see claim 9, which is the apparatus performing the steps of the claimed program; claim 8 is similarly rejected.
Regarding claims 2 – 7 and 10 – 15, the dependent claims do not cure the deficiencies of the respective independent claims from which they depend and thus are similarly rejected.
Regarding claims 3 and 11, the claims recite “should be,” which gives the claims indefinite metes and bounds: “should” is not definitive language and is conditional at best, yet no conditions are given for the multiplication to be performed.
Claim 11 recites the limitation "the probability" in line 4. There is insufficient antecedent basis for this limitation in the claim.
Claim 11 recites the limitation "the scene" in line 6. There is insufficient antecedent basis for this limitation in the claim.
Regarding claim 14, the claim depends on claim 8, which is the program of the functions performed on the system of claim 9; claim 14 is thus indefinite as to the statutory category of invention to which it belongs.
Should the claim depend on claim 9 instead of claim 8, the rejection would be overcome. For purposes of examination in the prior art rejections, claim 14 will be treated as dependent on claim 9 as an apparatus claim instead of a program.
The following is a quotation of 35 U.S.C. 112(d):
(d) REFERENCE IN DEPENDENT FORMS.—Subject to subsection (e), a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed. A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.
The following is a quotation of pre-AIA 35 U.S.C. 112, fourth paragraph:
Subject to the following paragraph [i.e., the fifth paragraph of pre-AIA 35 U.S.C. 112], a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed. A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.
Claims 11 and 14 are rejected under 35 U.S.C. 112(d) or pre-AIA 35 U.S.C. 112, 4th paragraph, as being of improper dependent form for failing to further limit the subject matter of the claims upon which they depend, or for failing to include all the limitations of the claims upon which they depend.
Regarding claim 11, the various lacks of antecedent basis for the claimed probabilities (which are recited in claim 10) indicate that, although claim 11 parallels claim 3, claim 11 does not depend on claim 10 (whereas claim 3 depends on claim 2).
Regarding claim 14, the claim does not properly limit claim 8 from which it depends as it changes the statutory class of claim 8.
Applicant may cancel the claim(s), amend the claim(s) to place the claim(s) in proper dependent form, rewrite the claim(s) in independent form, or present a sufficient showing that the dependent claim(s) complies with the statutory requirements.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1 – 15 are rejected under 35 U.S.C. 103 as being unpatentable over Hoang et al. (US Patent 7,711,044 B1, referred to as “Hoang” throughout) [cited in the Applicant’s November 22, 2024 IDS], and further in view of Ishii (US PG PUB 2012/0262598 A1, referred to as “Ishii” throughout).
Regarding claim 1, see claim 9, which is the apparatus performing the steps of the claimed method.
Regarding claim 2, see claim 10, which is the apparatus performing the steps of the claimed method.
Regarding claim 3, see claim 11, which is the apparatus performing the steps of the claimed method.
Regarding claim 4, see claim 12, which is the apparatus performing the steps of the claimed method.
Regarding claim 5, see claim 13, which is the apparatus performing the steps of the claimed method.
Regarding claim 6, see claim 14, which is the apparatus performing the steps of the claimed method.
Regarding claim 7, see claim 15, which is the apparatus performing the steps of the claimed method.
Regarding claim 8, see claim 9, which is the apparatus performing the steps of the claimed program.
Regarding claim 9, Hoang teaches pixel buffer management techniques with temporal considerations to select filters and gains to use on pixels. Ishii teaches noise and quantization relationships of pixels captured in comparing changes between images / sensor capture times.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the gain, filtering, and quantization functions / parameters of Hoang in processing imaged pixels with the considerations taught by Ishii in comparing / improving noise in the captured images / sensor grabs. The combination teaches:
circuitry [Hoang Figure 1 (see at least reference characters 204, 100, and 220) as well as Hoang Column 4 lines 38 – 62 (ASIC / circuit implementations) and Column 3 line 45 – Column 4 line 37 (hardware / circuit / processor implementations)] configured to execute:
a first obtaining function configured to obtain, from the buffer [Hoang Figures 3 – 4 (see at least reference characters 252, 266, 272, and 269 (various memories / buffers))], a stored set of pixel values for a set of one or more pixels relating to previous video frames in a sequence of video frames, wherein the stored set of pixel values has been filtered using a filtering algorithm to reduce temporal noise [Hoang Figures 3 – 4 and 6 (see at least reference characters 254, 256, 262 (temporal filter), 268 (filters), 266, 269, and 272 (memory / buffers) as well as Column 3 lines 13 – 29 (temporal filtering of pixels for inter prediction / motion compensation), Column 5 line 31 – Column 6 line 26 (combining previous / current pixels from filtering / managing stored pixels of video data frames / video frames), Column 8 lines 31 – 67 (temporal filter memory managed), Column 9 line 54 – Column 10 line 19 (video frame based approach to processing and noise estimation / computation / reduction techniques with the filtering in Figures 3 – 5))];
a third obtaining function configured to obtain, from a sensor, a current set of pixel values for the set of one or more pixels in a current video frame [Hoang Figures 1 – 2 as well as Column 8 line 59 – Column 9 line 30 (video camera capturing input (combinable with Ishii Figures 7, 12, or 17 (see at least reference characters 1701, 1703, 1309, and 1720)) with current / previous frame considerations) and Column 9 line 31 – Column 10 line 19 (video frame based approach to processing for pixels in memory)];
a first determining function configured to determine a new set of pixel values for the set of one or more pixels for storing in the buffer based on the stored set of pixel values and the current set of pixel values using the filtering algorithm to reduce temporal noise [Hoang Figures 3 – 4 and 6 (see at least reference characters 254, 256, 262 (temporal filter), 268 (filters), 266, 269, and 272 (memory / buffers) as well as Column 3 lines 13 – 29 (temporal filtering of pixels for inter prediction / motion compensation), Column 5 line 31 – Column 6 line 26 (combining previous / current pixels from filtering / managing stored pixels of video data frames / video frames), Column 8 lines 31 – 67 (temporal filter memory managed), Column 9 line 54 – Column 10 line 19 (video frame based approach to processing and noise estimation / computation / reduction techniques with the filtering in Figures 3 – 5 or filtering / adjustments as suggested in Ishii Figures 3 – 6 and 10 – 11 (filter coefficient setting) and 15 – 16 (effects on noise / reductions over time due to filtering such as in Paragraphs 63 – 69 (temporal filter pixels in memory / buffer)))];
a second determining function configured to determine a measure of amount of noise in the new set of pixel values [Hoang Column 9 line 54 – Column 10 line 67 (noise estimation is a MAD / absolute difference computation where in Column 10 lines 25 – 67 related MAD to filter level / gain / strength to use) and Column 11 line 28 – Column 12 line 50 (SAD computations and assessments of motion level as related noise metrics)];
a quantizing function configured to quantize the new set of pixel values based on the measure of amount of noise in the new set of pixel values [See next limitation for Ishii citations in combination with Hoang Figures 6 – 7 (see at least reference characters 290 and 292) as well as Column 14 lines 9 – 67 (frequency control of quantization based on noise estimates / levels present where gain scalars may affect quantization parameters for each pixel too further elaborated in Column 15 lines 1 – 29)], wherein the higher the measure of amount of noise in the new set of pixel values, the higher quantizing is performed [See previous limitation for Hoang citations in combination with Ishii Figures 8 and 13 – 16 as well as Paragraphs 61 – 66 (gain / scale control based on noise / undesired effects imaging), 76 – 78 and 97 – 102 (quantization scale control / ratio of quantization factor to use changed based on noise / excessive changes in pixels captured temporally; increases to address flickering noise)];
a compressing function configured to compress the quantized new set of pixel values [Ishii Figures 7, 12, and 17 (see at least reference character 711 or 713 (compressing / compressing by entropy coding) and 714 (buffer)) as well as Paragraphs 68 – 76 or 93 – 102 (buffer storing compressed values (e.g. transform (combinable with Hoang Figures 6 – 7 (see at least reference character 280 or 286) as well as Column 14 lines 11 – 67 (DCT used as the transform) and Column 16 lines 5 – 28 (DCT and quantization)) or entropy coded) in embodiments in Figures 7 or 12)]; and
an updating function configured to update the buffer with the compressed quantized new set of pixel values [Hoang Figures 2 – 4 (see use of current / previous fields and modify the buffers with each new frame at least and see at least the blender in reference character 258) and 6 (filtered pixels are quantized and transformed / coded see at least the “to 224” branch where the loop / updating memories is further rendered obvious in Ishii Figures 7, 12, and 17 (see loop going into inter prediction / motion compensation module)) as well as Column 7 line 2 – Column 8 line 41 (Column 7 lines 2 – 25 updating / combining previous and current pixels into the buffer / updating / overwriting; Column 7 line 59 – Column 8 line 41 rewriting / computing current pixels based on filtering / noise levels as in Column 8 lines 42 – 67)].
The motivation to combine Ishii with Hoang is to combine features in the same / related field of invention of processing photographed images in a compression scheme using inter prediction [Ishii Paragraph 1] in order to improve camera gain and reduce flickering in video capture [Ishii Paragraphs 1 – 3 and 10 – 12, where the Examiner observes that at least KSR rationales (D) or (F) are also applicable].
This is the motivation to combine Hoang and Ishii, which will be relied upon throughout the rejection.
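As a purely illustrative sketch of the sequence of functions recited in claim 9 (not drawn from Hoang or Ishii; the function names, the blending weight, and the noise-to-quantization-step rule below are all hypothetical), the obtain / filter / measure-noise / quantize / compress / update flow may be understood as:

```python
def update_buffer(stored, current, alpha=0.8):
    """Hypothetical sketch of the claimed functions; all names and
    parameters are illustrative assumptions, not the claimed algorithm."""
    # First determining function: IIR-style temporal filtering that blends
    # the stored (previously filtered) values with the current values.
    new = [alpha * s + (1.0 - alpha) * c for s, c in zip(stored, current)]

    # Second determining function: a measure of the amount of noise; here a
    # mean-absolute-difference estimate (cf. Hoang's MAD / SAD computations).
    noise = sum(abs(c - s) for s, c in zip(stored, current)) / len(stored)

    # Quantizing function: the higher the noise measure, the coarser
    # (larger) the quantization step applied to the new values.
    step = 1.0 + noise
    quantized = [round(v / step) * step for v in new]

    # Compressing function: stands in for transform + entropy coding; kept
    # as a no-op placeholder so the sketch stays self-contained.
    compressed = quantized

    # Updating function: the caller overwrites the buffer with the result.
    return compressed, noise

stored = [100.0, 102.0, 98.0]    # filtered pixel values from previous frames
current = [104.0, 101.0, 97.0]   # same pixels in the current frame
buffer_new, noise = update_buffer(stored, current)
```

The sketch only shows the recited ordering of functions; the actual claim does not limit the filtering, noise measure, or compression to any particular formula.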
Regarding claim 10, Hoang teaches pixel buffer management techniques with temporal considerations to select filters and gains to use on pixels. Ishii teaches noise and quantization relationships of pixels captured in comparing changes between images / sensor capture times.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the gain, filtering, and quantization functions / parameters of Hoang in processing imaged pixels with the considerations taught by Ishii in comparing / improving noise in the captured images / sensor grabs. The combination teaches:
a third determining function configured to determine a measure of probability that a change in the current set of pixel values in relation to the stored set of pixel values is due to changes in the scene [Hoang Figures 3 – 6 as well as Column 7 line 2 – Column 8 line 41 (the “percentage” is an obvious variant of the claimed “probability” to one of ordinary skill in the art, where the D and M values computed are percentages / probabilities of motion between frames; further, Column 8 lines 1 – 41 and the operating principle in Column 8 lines 42 – 60, with the amount of motion computed and the desire to use small / large percentages of previous frames, render obvious the computations claimed; further, Column 11 line 28 – Column 12 line 56 (see equations 4 – 8) computes the level / probability of motion between frames)],
wherein, in the first determining function, the new set of pixel values for storing in the buffer are determined further based on the measure of probability that a change in the current set of pixel values in relation to the stored set of pixel values is due to changes in the scene [See claim 9 for citations regarding the “first determining function …” limitation and additionally Hoang Figures 3 – 6 as well as Column 7 line 2 – Column 8 line 41 (the “percentage” is an obvious variant of the claimed “probability” to one of ordinary skill in the art, where the D and M values computed are percentages / probabilities of motion between frames; further, Column 8 lines 1 – 41 and the operating principle in Column 8 lines 42 – 60, with the amount of motion computed and the desire to use small / large percentages of previous frames, render obvious the computations claimed; further, Column 11 line 28 – Column 12 line 56 (see equations 4 – 8) computes the level / probability of motion between frames)].
See claim 9 for the motivation to combine Hoang and Ishii.
Regarding claim 11, Hoang teaches pixel buffer management techniques with temporal considerations to select filters and gains to use on pixels. Ishii teaches noise and quantization relationships of pixels captured in comparing changes between images / sensor capture times.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the gain, filtering, and quantization functions / parameters of Hoang in processing imaged pixels with the considerations taught by Ishii in comparing / improving noise in the captured images / sensor grabs. The combination teaches:
wherein the first determining function is further configured to [See claim 9 for citations regarding the “first determining function …” limitation and additional citations in the following limitations]
determine a first weight by which the stored set of pixel values should be multiplied and a second weight by which the current set of pixel values should be multiplied to produce new set of pixel values, wherein the higher the probability that a change in the current set of pixel values in relation to the stored set of pixel values is due to changes in the scene, the lower the first weight is in relation to the second weight [See claim 10 for citations of “probability” claimed and additionally Hoang Figures 6 – 7 (see at least reference characters 290 and 292) as well as Column 7 line 2 – Column 8 line 41 (weights / gain / filter coefficients a function of motion changes / probability of scene change), Column 13 line 31 – Column 14 line 8 (filter coefficients (e.g. IIR coefficients) a function of weights based on noise / smoothing effect desired), Column 14 lines 9 – 67 (frequency control of quantization based on noise estimates / levels present where gain scalars may affect quantization parameters for each pixel too further elaborated in Column 15 lines 1 – 29 based on noise); Ishii Figures 8 and 13 – 16 as well as Paragraphs 61 – 66 (gain / scale control based on noise / undesired effects imaging), 76 – 78 and 97 – 102 (quantization scale control / ratio of quantization factor to use changed based on noise / excessive changes in pixels captured temporally; increases to address flickering noise)].
See claim 9 for the motivation to combine Hoang and Ishii.
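As a purely illustrative sketch of the weighting recited in claim 11 (the linear weight rule below is a hypothetical example, not drawn from Hoang or Ishii or from the claim itself):

```python
def blend(stored, current, p_scene_change):
    # Hypothetical rule: the higher the probability that the change in the
    # current value is due to changes in the scene (rather than noise),
    # the lower the first weight (on the stored value) relative to the
    # second weight (on the current value).
    w1 = 1.0 - p_scene_change  # first weight, multiplies the stored value
    w2 = p_scene_change        # second weight, multiplies the current value
    return w1 * stored + w2 * current
```

With a high p_scene_change the result tracks the current value (little temporal smoothing); with a low p_scene_change it tracks the stored value (strong smoothing), which is the inverse relationship the claim recites.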
Regarding claim 12, Hoang teaches pixel buffer management techniques with temporal considerations to select filters and gains to use on pixels. Ishii teaches noise and quantization relationships of pixels captured in comparing changes between images / sensor capture times.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the gain, filtering, and quantization functions / parameters of Hoang in processing imaged pixels with the considerations taught by Ishii in comparing / improving noise in the captured images / sensor grabs. The combination teaches:
a second obtaining function configured to obtain a measure of an amount of noise in the stored set of pixel values [Hoang Column 9 line 54 – Column 10 line 67 (noise estimation is a MAD / absolute difference computation where in Column 10 lines 25 – 67 related MAD to filter level / gain / strength to use) and Column 11 line 20 – Column 12 line 50 (SAD computations and assessments of motion level as related noise metrics and see at least equation (3) for comparing previous noise / distortion to the current distortion / noise)],
wherein the higher the measure of amount of noise in the stored set of pixel values, the lower the first weight is in relation to the second weight [Hoang Figures 6 – 7 (see at least reference characters 290 and 292) as well as Column 7 line 2 – Column 8 line 41 (weights / gain / filter coefficients a function of motion changes / probability of scene change and low weights for less smoothing as desired), Column 11 lines 35 – 67 (low weights for high noise / motion), Column 13 line 31 – Column 14 line 8 (filter coefficients (e.g. IIR coefficients) a function of weights based on noise / smoothing effect desired), Column 14 lines 9 – 67 (frequency control of quantization based on noise estimates / levels present where gain scalars may affect quantization parameters for each pixel too further elaborated in Column 15 lines 1 – 29 based on noise); Ishii Figures 8 and 13 – 16 as well as Paragraphs 61 – 66 (gain / scale control based on noise / undesired effects imaging), 76 – 78 and 97 – 102 (quantization scale control / ratio of quantization factor to use changed based on noise / excessive changes in pixels captured temporally; increases to address flickering noise)].
See claim 9 for the motivation to combine Hoang and Ishii.
Regarding claim 13, Hoang teaches pixel buffer management techniques with temporal considerations to select filters and gains to use on pixels. Ishii teaches noise and quantization relationships of pixels captured in comparing changes between images / sensor capture times.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the gain, filtering, and quantization functions / parameters of Hoang in processing imaged pixels with the considerations taught by Ishii in comparing / improving noise in the captured images / sensor grabs. The combination teaches:
a second obtaining function configured to obtain a measure of amount of noise in the stored set of pixel values [Hoang Column 9 line 54 – Column 10 line 67 (noise estimation is a MAD / absolute difference computation where in Column 10 lines 25 – 67 related MAD to filter level / gain / strength to use) and Column 11 line 20 – Column 12 line 50 (SAD computations and assessments of motion level as related noise metrics and see at least equation (3) for comparing previous noise / distortion to the current distortion / noise)],
wherein, in the first determining function, determining the new set of pixel values for storing in the buffer is further based on the measure of amount of noise in the stored set of pixel values [Hoang Figures 2 – 4 (see use of current / previous fields and modify the buffers with each new frame at least and see at least the blender in reference character 258) and 6 – 7 (filtered pixels are quantized and transformed / coded see at least the “to 224” branch where the loop / updating memories is further rendered obvious in Ishii Figures 7, 12, and 17 (see loop going into inter prediction / motion compensation module)) as well as Column 7 line 2 – Column 8 line 41 (Column 7 lines 2 – 25 updating / combining previous and current pixels into the buffer / updating / overwriting; Column 7 line 59 – Column 8 line 41 rewriting / computing current pixels based on filtering / noise levels as in Column 8 lines 42 – 67 combinable with Ishii Paragraph 64) and Column 14 lines 9 – 67 (frequency control of quantization based on noise estimates / levels present where gain scalars may affect quantization parameters for each pixel too further elaborated in Column 15 lines 1 – 29)].
See claim 9 for the motivation to combine Hoang and Ishii.
Regarding claim 14, Hoang teaches pixel buffer management techniques with temporal considerations to select the filters and gains to use on pixels. Ishii teaches relationships between noise and quantization for captured pixels when comparing changes between images / sensor capture times.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the gain, filtering, and quantization functions / parameters of Hoang in processing imaged pixels with the considerations taught by Ishii in comparing / improving noise in the captured images / sensor grabs. The combination teaches
wherein the buffer is an infinite impulse response (IIR) buffer, and wherein the filter algorithm used to reduce temporal noise is IIR filtering [Hoang Figures 3 – 6 as well as Column 1 lines 55 – 67 (IIR temporal filter with buffer considerations) and Column 13 lines 10 – 61 (see at least equation 12); Ishii Figure 6 and Paragraph 64].
See claim 9 for the motivation to combine Hoang and Ishii.
Regarding claim 15, Hoang teaches pixel buffer management techniques with temporal considerations to select the filters and gains to use on pixels. Ishii teaches relationships between noise and quantization for captured pixels when comparing changes between images / sensor capture times.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the gain, filtering, and quantization functions / parameters of Hoang in processing imaged pixels with the considerations taught by Ishii in comparing / improving noise in the captured images / sensor grabs. The combination teaches
retrieve, from the buffer, a compressed stored set of pixel values for the set of one or more pixels relating to previous video frames in the sequence of video frames, wherein the compressed stored set of pixel values has been filtered using a filtering algorithm to reduce temporal noise [Hoang Figures 2 – 4 (see use of current / previous fields and modifying the buffers with at least each new frame, and see at least the blender in reference character 258) and 6 (filtered pixels are quantized and transformed / coded; see at least the “to 224” branch, where the loop / updating of memories is further rendered obvious by Ishii Figures 7, 12, and 17 (see loop going into the inter prediction / motion compensation module)) as well as Column 7 line 2 – Column 8 line 41 (Column 7 lines 2 – 25 updating / combining previous and current pixels into the buffer / updating / overwriting; Column 7 line 59 – Column 8 line 41 rewriting / computing current pixels based on filtering / noise levels as in Column 8 lines 42 – 67); Ishii Figures 1, 7, 12, and 17 (see at least reference characters 111, 711, 1211, or 1711 (DCT transform / compression) and 120, 720, 1220, and 1720 (memory to save data to)) as well as Paragraphs 44 – 50, 72 – 77, and 93 – 98 (store quantized / transformed coefficients to video memory to later decompress / inverse transform for inter prediction / motion detection)]; and
decompress the compressed stored set of pixel values to obtain the stored set of pixel values [See previous limitation and additionally Ishii Figures 7, 12, and 17 (see at least reference characters 118, 718, 1218, or 1718 (inverse transform / compression) and 120, 720, 1220, and 1720 (memory to save data to)) as well as Paragraphs 9, 50, 73, and 94 (decompress / inverse transform)].
See claim 9 for the motivation to combine Hoang and Ishii.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Tyler W. Sullivan whose telephone number is (571) 270-5684. The examiner can normally be reached on IFP.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, David Czekaj, can be reached at (571) 272-7327. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/TYLER W. SULLIVAN/ Primary Examiner, Art Unit 2487