DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1, 2, 3, 9, 14, 15, 16, 19, and 20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Pan et al. (“Effect of camera temperature variations on stereo-digital image correlation measurements,” Applied Optics, Vol. 54, No. 34, December 1, 2015), hereinafter referred to as Pan.
Regarding claim 1, Pan teaches a method comprising:
determining that a first image and a second image generated using an image sensor depict a same static scene (Pan Fig. 1: “CCD cameras”; Pan pg. 10090 right column: “the stereo-DIC system was placed about 730 mm in the front of the test sample, with the two cameras aligned at a proper angle to ensure that the region of interest can be fully imaged by both cameras”); and
responsive to the determining, determining a noise estimate for the image sensor based at least on a difference between first values of a first subset of pixels of the first image and second values of a corresponding second subset of pixels of the second image (Pan Fig. 2: “Region of Interest”; Pan Table I: “Relative Error”; Pan Fig. 4 & pg. 10092 left column: “The relative percentage errors listed in the last column show the slight changes due to temperature change in the intrinsic geometry of the two cameras …displacement fields of the last image pair recorded 5h after imitating the image-recording process”).
Regarding claim 2, Pan teaches the method of claim 1, further comprising:
determining one or more conditions of the image sensor associated with the first image and the second image, wherein the noise estimate is associated with the one or more conditions of the image sensor (Pan pg. 10092 left column discussed above teaches measuring temperature changes for the two cameras; also see Pan Fig. 1: “Temperature sensors” & pg. 10090 left column: “to investigate the effect of camera temperature variations on stereo-DIC measurements”).
Regarding claim 3, Pan teaches the method of claim 2, wherein the one or more conditions of the image sensor comprise temperature data measured using one or more temperature sensors associated with the image sensor (Pan Fig. 1, pg. 10090 left column, & pg. 10092 left column discussed above).
Regarding claim 9, Pan teaches the method of claim 1, wherein the determining that the first image and the second image generated using the image sensor depict the same static scene comprises:
receiving an indication that the image sensor was in a motionless state between generation of the first image and the second image (Pan Fig. 1: the cameras are motionless and stationary in the experimental setup; Pan pg. 10092 right column: “Since both the test reference object and the camera bodies were tightly fixed, the reason for the image motions can only be attributed to the movements of the camera target sensors due to thermal expansion of the mechanical components in the two cameras”).
Regarding claim 14, Pan teaches a system comprising:
an image sensor; and a processing device, coupled to the image sensor (Pan Fig. 1 & Abstract: “stereo-DIC system”; Pan pg. 10091 left column: “images pairs were subsequently processed using the commercial software PMLBAB DIC-3D”), the processing device to perform the method described in claim 1.
Therefore, claim 14 is rejected using the same rationale as applied to claim 1 discussed above.
Claim 15 is rejected using the same rationale as applied to claim 2 discussed above.
Claim 16 is rejected using the same rationale as applied to claim 3 discussed above.
Regarding claim 19, Pan teaches the system of claim 14, wherein the system is comprised in at least one of (Note that only one of the alternative limitations is required by the claim language):
a control system for an autonomous or semi-autonomous machine;
a perception system for an autonomous or semi-autonomous machine;
a system for performing one or more simulation operations;
a system for performing one or more digital twin operations;
a system for performing light transport simulation;
a system for performing collaborative content creation for 3D assets;
a system for performing one or more deep learning operations;
a system for presenting at least one of augmented reality content, virtual reality content, or mixed reality content;
a system for hosting one or more real-time streaming applications;
a system implemented using an edge device;
a system implemented using a robot;
a system for performing one or more conversational AI operations;
a system implementing one or more language models;
a system implementing one or more large language models (LLMs);
a system for performing one or more generative AI operations;
a system for generating synthetic data (Pan Fig. 4: virtual displacement fields are generated);
a system incorporating one or more virtual machines (VMs);
a system implemented at least partially in a data center; or
a system implemented at least partially using cloud computing resources.
Regarding claim 20, Pan teaches one or more processors comprising circuitry to perform the method described in claim 1 (Pan Fig. 1, Abstract, & pg. 10091 left column discussed above).
Therefore, claim 20 is rejected using the same rationale as applied to claim 1 discussed above.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 4, 13, and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Pan et al. (Applied Optics, Vol. 54, No. 34, December 1, 2015), in view of de Souza Bachour et al. (“Skin-color independent robust assessment of capillary refill time,” J Biophotonics, 2023 Nov;16(11):e202300063. doi: 10.1002/jbio.202300063. Epub 2023 Aug 10), hereinafter referred to as Pan and de Souza Bachour, respectively.
Regarding claim 4, Pan teaches the method of claim 1, but does not appear to explicitly teach computing an average pixel intensity value of the first subset of pixels and the second subset of pixels, wherein the noise estimate is associated with the average pixel intensity value.
Pertaining to the same field of endeavor, de Souza Bachour teaches computing an average pixel intensity value of the first subset of pixels and the second subset of pixels, wherein the noise estimate is associated with the average pixel intensity value (de Souza Bachour pg. 3 left column: “The average intensities of the R, G, and B channels of the ROI pixels are calculated for each frame (Figure 2A) and the G-channel (Figure 2A) presents the best signal-to-noise ratio”).
Pan and de Souza Bachour are considered to be analogous art because they are directed to image processing. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system and method for determining the effect of camera temperature variations on stereo DIC measurements (as taught by Pan) to calculate average pixel intensity (as taught by de Souza Bachour) because the combination can determine the best SNR (de Souza Bachour pg. 3 left column).
Regarding claim 13, Pan teaches the method of claim 1, but does not appear to explicitly teach applying one or more noise reduction algorithms to reduce noise associated with at least the first image or the second image based at least on the noise estimate.
Pertaining to the same field of endeavor, de Souza Bachour teaches applying one or more noise reduction algorithms to reduce noise associated with at least the first image or the second image based at least on the noise estimate (de Souza Bachour Abstract: “An adaptive algorithm identifies the optimal regression region for noise reduction”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system and method for determining the effect of camera temperature variations on stereo DIC measurements (as taught by Pan) to reduce the noise (as taught by de Souza Bachour) because the combination provides benefits for reliable and reproducible quantitative methods for various image applications (de Souza Bachour Abstract).
Claim 17 is rejected using the same rationale as applied to claim 4 discussed above.
Claims 10, 11, and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Pan et al. (Applied Optics, Vol. 54, No. 34, December 1, 2015), in view of Bilbrey et al. (US 2007/0071342 A1), hereinafter referred to as Pan and Bilbrey, respectively.
Regarding claim 10, Pan teaches the method of claim 1, wherein the determining that the first image and the second image generated using the image sensor depict the same static scene comprises:
determining an amount of pixel displacement between the first image and the second image (Pan Fig. 4 & pg. 10092 left column discussed above).
However, Pan does not appear to explicitly teach determining that the amount of pixel displacement is below a threshold amount of pixel displacement.
Pertaining to the same field of endeavor, Bilbrey teaches determining that the amount of pixel displacement is below a threshold amount of pixel displacement (Bilbrey ¶0019: “Blending of pixel data is performed only for pixels having motion below a selected threshold. Compensation for gain levels associated with pixels having motion above the selected threshold is performed”; Bilbrey ¶0059: “ancillary data relating to operating temperature can be used to determine whether the image sensor and/or other components are subject to temperature outside of predetermined thermal specifications, which can cause some fixed pattern noise in the detected images”).
Pan and Bilbrey are considered to be analogous art because they are directed to image processing. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system and method for determining the effect of camera temperature variations on stereo DIC measurements (as taught by Pan) to compare the displacement to a threshold (as taught by Bilbrey) because the combination can determine which pixels contribute to the pattern noise (Bilbrey ¶0019, ¶0059).
Regarding claim 11, Pan teaches the method of claim 1, but does not appear to explicitly teach that the image sensor is a high dynamic range (HDR) sensor.
Pertaining to the same field of endeavor, Bilbrey teaches that the image sensor is a high dynamic range (HDR) sensor (Bilbrey ¶0080: “detected using the high gain level, the expanded dynamic range frame”; Bilbrey ¶0085: “the camera is again switched into the expanded dynamic range mode, and new high gain and low gain levels are calculated to allow the gain levels to adapt to the new lighting conditions”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system and method for determining the effect of camera temperature variations on stereo DIC measurements (as taught by Pan) to use an HDR sensor (as taught by Bilbrey) because the combination provides more flexibility and adapts to new environments (Bilbrey ¶0085).
Regarding claim 12, Pan teaches the method of claim 1, wherein the determining the noise estimate is performed while the image sensor is deployed in an environment (Pan Figs. 1, 4-5).
However, Pan does not appear to explicitly teach that the image sensor is operating on a machine.
Pertaining to the same field of endeavor, Bilbrey teaches that the image sensor is operating on a machine deployed in an environment (Bilbrey ¶0003: “Techniques and systems can be implemented to improve the quality of images detected by image sensors to improve the quality of the detected images”; Bilbrey ¶0047: “camera settings and/or other information relating to the state of the camera and/or the environment during or at approximately the time at which the images are detected”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the system and method for determining the effect of camera temperature variations on stereo DIC measurements (as taught by Pan) to use the image sensor on a machine (as taught by Bilbrey) because the combination can be used to aid in video editing, web cam applications, videophone operations, etc. (Bilbrey ¶0002).
Allowable Subject Matter
Claims 5-8 and 18 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
The following is a statement of reasons for the indication of allowable subject matter:
Regarding claim 5, the prior art of record (Pan, in view of de Souza Bachour) teaches that it was known at the time the application was filed to generate a plot of average pixel intensity fluctuations over time (de Souza Bachour Fig. 3).
However, the prior art, alone or in combination, does not appear to teach or suggest generating a noise profile comprising noise estimates for a plurality of different average pixel intensity values (i.e., noise v. intensity, as opposed to the prior art’s average intensity v. time).
[Image: Applicant’s Fig. 3 (media_image1.png, greyscale)]
[Image: de Souza Bachour Fig. 3 (media_image2.png, greyscale)]
Claims 6-8 are objected to for the same reason as applied to claim 5 discussed above due to dependency.
Claim 18 is objected to for the same reason as applied to claim 5 discussed above.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SOO J SHIN whose telephone number is (571)272-9753. The examiner can normally be reached M-F; 10-6.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Matthew Bella can be reached at (571)272-7778. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Soo Shin/Primary Examiner, Art Unit 2667 571-272-9753
soo.shin@uspto.gov