Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-4, 12-15, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Shinmei et al. (US 20110317028 A1) in view of Jo (US 20130271623 A1).
Regarding claim 1, Shinmei teaches An image processing method to distinguish moving objects from Light-Emitting Diode (LED) flickers in an image based on the pixel intensity-changing patterns (Fig. 45), the method comprising:
capturing a plurality of frames of a scene that respectively correspond to a plurality of different exposure settings (Fig. 45; paras. 0235-0236; capturing a long-exposure image and a short-exposure image);
obtaining a plurality of vectors of pixel intensities respectively from the plurality of frames, each of the plurality of vectors of pixel intensities corresponding to a cluster of pixels in one of the plurality of frames (Fig. 45; paras. 0235-0236);
generating an intensity-changing pattern for the cluster of pixels across the plurality of frames based on the plurality of vectors of pixel intensities (Figs. 43, 45; paras. 0366-0368); and
determining, based on the intensity-changing pattern, whether the cluster of pixels captures a moving object or an LED light source (Figs. 43, 45; paras. 0366-0368; “FIG. 43 includes a step ST62 executed to determine whether a pixel or a pixel area also referred to as a pixel block is a static one pertaining to an image of an image taking object or a dynamic one pertaining to an image of a moving object”),
but fails to teach
capturing a plurality of frames of a scene using a spatially multiplexed image sensor, wherein the spatially multiplexed image sensor comprises a plurality of pixel groups, each pixel group comprises a plurality of pixels with a plurality of different exposure settings.
However, in the same field of endeavor Jo teaches
capturing a plurality of frames of a scene using a spatially multiplexed image sensor, wherein the spatially multiplexed image sensor comprises a plurality of pixel groups, each pixel group comprises a plurality of pixels with a plurality of different exposure settings (Figs. 8-13; paras. 0129-0133; “A pixel array that has pixels of different exposure times such as the short-time exposure pixels and the long-time exposure pixels in one image sensor as in FIG. 8 is called a spatial varying exposure (SVE) array”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to modify Shinmei with the teachings of Jo to capture a plurality of frames of a scene using a spatially multiplexed image sensor, wherein the spatially multiplexed image sensor comprises a plurality of pixel groups, each pixel group comprising a plurality of pixels with a plurality of different exposure settings, in order to improve the speed of acquiring differently exposed images for faster HDR imaging, yielding a predictable result.
Regarding claim 2, the combination of Shinmei and Jo teaches everything as claimed in claim 1. In addition, Jo teaches wherein the plurality of different exposure settings correspond to linearly increased exposure times by a predetermined factor (para. 0178).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to apply the teachings of Jo in the combination so that the plurality of different exposure settings correspond to linearly increased exposure times by a predetermined factor, in order to maintain a proper exposure ratio among the differently exposed images for optimized HDR imaging, yielding a predictable result.
Regarding claim 3, the combination of Shinmei and Jo teaches everything as claimed in claim 2. In addition, Shinmei teaches wherein the generating the intensity-changing pattern for the cluster of pixels across the plurality of frames based on the plurality of vectors of pixel intensities comprises:
performing pixel intensity alignment (by amplifiers 123 and 124 using the exposure ratio) among the plurality of vectors of pixel intensities by using the predetermined factor, thereby obtaining a plurality of vectors of aligned pixel intensities (Figs. 36, 40, 45; paras. 0128-0134; “The gain output from the CPU 131 can be the reciprocal of an exposure ratio or a corrected value of the reciprocal”).
Regarding claim 4, the combination of Shinmei and Jo teaches everything as claimed in claim 3. In addition, Shinmei teaches wherein the performing pixel intensity alignment among the plurality of vectors of pixel intensities by using the predetermined factor comprises:
for a first vector of pixel intensities corresponding to a shorter exposure setting, enhancing the first vector of pixel intensities by the predetermined factor to align with a second vector of pixel intensities corresponding to a longer exposure setting (enhanced by amplifiers 123 and 124 before input into synthesizing processing section 126H/J, where differences/vectors are calculated by static/dynamic-state determination section 1270J) (Figs. 36, 40, 45; paras. 0128-0134).
Regarding claim 12, the combination of Shinmei and Jo teaches everything as claimed in claim 1. In addition, Shinmei and Jo teach A system (Shinmei; paras. 0402-0406) associated with a spatially multiplexed image sensor (Jo’s “one image sensor as in FIG. 8 is called a spatial varying exposure (SVE) array” as taught in claim 1), comprising one or more processors and one or more non-transitory computer-readable memories coupled to the one or more processors and configured with instructions executable by the one or more processors to cause the system to perform operations (Shinmei; paras. 0402-0406) comprising: (corresponding features as taught in claim 1).
Regarding claims 13-15, these claims recite features corresponding to claims 2-4 and are rejected for the same reasons set forth above, respectively.
Regarding claim 20, the combination of Shinmei and Jo teaches everything as claimed in claim 1. In addition, Shinmei and Jo teach A non-transitory computer-readable storage medium (Shinmei; paras. 0402-0406) associated with a spatially multiplexed image sensor (Jo’s “one image sensor as in FIG. 8 is called a spatial varying exposure (SVE) array” as taught in claim 1), the non-transitory computer-readable storage medium configured with instructions executable by one or more processors to cause the one or more processors to perform operations (Shinmei; paras. 0402-0406) comprising: (corresponding features as taught in claim 1).
Allowable Subject Matter
Claims 5-11 and 16-19 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Nakamura et al (US 10432875 B2): generates an HDR image on the basis of the flicker-corrected first exposure image and the second exposure image and then outputs the HDR image. The flicker parameter estimation unit can execute respective processing for each of the detection areas, to estimate the flicker parameter for each of the detection areas.
Ashida et al (US 20130258134 A1): a flicker that flashes is detected for each of the divided areas, it is detected, for each of the divided areas where the flicker has been detected, whether a divided area is an LED area including light from a light emitting diode (LED) based on a luminance difference between a luminance in a turn-on state and a luminance in an turn-off state, and LED area information is output.
Nakasuji et al (US 20070046790 A1): discriminating whether or not each area including a plurality of pixels is the still image area on the frame image including the moving object OB. The motion vector of the object is obtained using the frame images, and the area through which the object has passed between the frame images from the motion vector, thereby judging the area as the motion area.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Quan Pham whose telephone number is (571)272-4438. The examiner can normally be reached Mon-Fri 9am-7pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Sinh Tran can be reached at (571) 272-7564. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Quan Pham/Primary Examiner, Art Unit 2637