Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
This action is in response to the applicant's communication filed on 01/22/2024. By virtue of this communication, claims 1-19, filed on 01/22/2024, are currently pending in the instant application.
Information Disclosure Statement
The Information Disclosure Statement (IDS) form PTO-1449, filed on 01/22/2024, is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosed therein has been considered by the examiner.
Drawings
The drawings received on 01/22/2024 have been reviewed by the Examiner and are acceptable.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claim 13 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Claim 13 recites the limitation "the number of taps" in line 2. There is insufficient antecedent basis for this limitation in the claim.
Further, the claim 13 limitation “does not perform the filter processing in a case where it is determined that the number of taps is insufficient” is unclear. Examiner notes that the applicant’s specification discloses in ¶[0055] that “in a case where it is determined that the kernel size does not exceed the number of taps, the flow proceeds to step S1007,” and step S1007, per figure 6, is “filter processing to current block.” It is not clear whether the filtering is stopped/ended or returns to a first block of the image; there is no language in the specification supporting stopping the filtering process based on an insufficient number of filter taps. As a result, the claim language is unclear, and Examiner interprets the claim as best understood for the prior art rejection as “wherein the processor is configured to acquire the number of taps of a filter kernel in the filter processing for all pixels of the image.”
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-3, 7-8, and 12-19 are rejected under 35 U.S.C. 103 as being unpatentable over Hatakeyama (US 2011/0135216) in view of Findlay et al. (US 2010/0008597).
As per claim 1, Hatakeyama discloses an image processing apparatus comprising: a memory storing instructions; and a processor configured to execute the instructions to:
“acquire distance information about an in-focus position,” (Hatakeyama, ¶[0103] discloses obtaining information representing a condition of the image pickup optical system 101, such as a focal length, an aperture value, and an object distance (an image pickup distance).)
“and perform filter processing using an aberration filter determined based on the distance information and optical information about an optical system, for image data acquired using an image sensor,” (Hatakeyama, ¶[0104] discloses selecting from a memory 108 an image restoration filter suitable for the condition of the image pickup optical system 101, which is obtained from the optical system condition information. ¶[0107] discloses that one method of selecting an image restoration filter near a position corresponding to the actual image pickup condition calculates a distance (or condition difference amount) in the image pickup condition space between the actual image pickup condition and a plurality of image pickup conditions stored in the image restoration filter, and selects the image restoration filter located at a position having the shortest distance. ¶[0111] discloses a plurality of image restoration filters located at positions near the actual image pickup condition, performing interpolation processing for the plurality of image restoration filters according to the condition difference amount, and thereby generating an image restoration filter suitable for the image pickup condition. ¶[0015] and ¶[0119] disclose obtaining image pickup condition information from the condition detector 107 and selecting the image restoration filter suitable for the image pickup condition.)
However, Hatakeyama does not explicitly disclose the following, which would have been obvious in view of Findlay from a similar field of endeavor: “wherein the aberration filter has a parameter for adjustment that makes closer to each other a characteristic of a first blurred image formed on an image plane due to defocus on a close distance side of the in-focus position, and a characteristic of a second blurred image formed on the image plane due to defocus on an infinity side of the in-focus position.” (Findlay, ¶[0094] discloses defocus of the camera system, which may be dependent on the position of the object with respect to the focused distance of the camera in object space. ¶[0098] discloses the iterative restoration process described in the next paragraphs; ¶[0101] then discloses that the restored image is free of defocus artifacts when the coding and decoding kernels are equal, and that the variance is maximized in this case; see FIG. 10, which shows the variance of the restored (and rescaled) image of Lena as a function of the defocus parameter W20 kernel used. ¶[0106] discloses that the maximum defocus for an invariant modular transfer function (MTF) (blur) is |W20|max = 3α(1-v); Examiner notes the absolute value of the MTF/blur characteristic is treated the same on either side of best focus (opposite signs). Further, see ¶[0108], reinforcing the magnitude-based treatment.)
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Findlay's image artifact removal technique with Hatakeyama's image processing technique to provide the known and expected uses and benefits of Findlay's technique. The proposed combination would have constituted a mere arrangement of old elements, each performing its known function, the combination yielding no more than one would expect from such an arrangement.
Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Findlay into Hatakeyama in order to improve image quality and increase computational speed. (Refer to Findlay paragraphs [0014]-[0015].)
Claims 14 and 17 have been analyzed and are rejected for the reasons indicated for claim 1 above. Additionally, the rationale and motivation to combine the Hatakeyama and Findlay references, presented in the rejection of claim 1, apply to these claims.
As per claim 2, Hatakeyama as modified by Findlay discloses the image processing apparatus according to claim 1, “wherein the optical information includes information about at least one of an aperture diameter, zoom, and focus.” (Hatakeyama, ¶[0115] discloses that an aperture value changes when an aperture diameter of the stop 101a is controlled. A position of the focus lens 101b is controlled by an autofocus ("AF") mechanism or a manual focus mechanism (not illustrated) for focusing according to an object distance.)
Claims 15 and 18 have been analyzed and are rejected for the reasons indicated in claim 2 above.
As per claim 3, Hatakeyama as modified by Findlay discloses the image processing apparatus according to claim 1, “wherein the optical information includes information about an aberration state on the image plane.” (Hatakeyama, ¶[0105] discloses that the image restoration filters stored in the memory 108 are discretely arranged in the image pickup condition space having three image pickup conditions allocated to axes of a focal length (state A), an aperture value (state B), and an object distance (state C).)
Claims 16 and 19 have been analyzed and are rejected for the reasons indicated in claim 3 above.
As per claim 7, Hatakeyama as modified by Findlay discloses the image processing apparatus according to claim 1, “wherein the processor is configured to acquire an aberration characteristic based on the distance information and the optical information, and wherein the parameter is a parameter based on the aberration characteristic.” (Hatakeyama, ¶[0065] discloses an object image formed by the image pickup optical system that deteriorates due to the OTF of the aberration of the image pickup optical system. ¶[0073] discloses that the number of taps (or cells) of the image restoration filter can be determined according to the aberrational characteristic of the image pickup optical system and the required restoration accuracy. ¶[0076] discloses that the image restoration filter can be obtained by calculating or measuring the OTF of the image pickup system (image pickup optical system), and by performing an inverse Fourier transform of a function based on the inverse function of the OTF. Further, ¶[0079] and ¶[0103] disclose conditions of the image pickup optical system 101 (such as a focal length, an aperture value, and an object distance (an image pickup distance)). Further, ¶[0104] and ¶[0105] disclose that the image restoration filters stored in the memory 108 are discretely arranged in the image pickup condition space having three image pickup conditions allocated to axes of a focal length (state A), an aperture value (state B), and an object distance (state C). ¶[0112] discloses that the OTF used to generate the image restoration filter can be calculated or measured at the actual condition of the image pickup apparatus itself or the image pickup optical system.)
As per claim 8, Hatakeyama as modified by Findlay discloses the image processing apparatus according to claim 7, “wherein the processor is configured to acquire the aberration characteristic from a lens apparatus having the optical system, an image pickup apparatus having the image sensor, or an external device.” (Hatakeyama, ¶[0065] discloses the OTF of the aberration of the image pickup optical system including a lens and a variety of optical filters. ¶[0112] discloses that the OTF used to generate the image restoration filter can be calculated utilizing an optical design tool and an optical analytical tool. Moreover, the OTF can be measured at the actual condition of the image pickup apparatus itself or the image pickup optical system.)
As per claim 12, Hatakeyama as modified by Findlay discloses the image processing apparatus according to claim 1, “wherein the processor performs the filter processing for each divided area in the image data.” (Hatakeyama, ¶[0018] discloses that, for a two-dimensional image, the image restoration filter is usually a two-dimensional filter having a tap (cell) corresponding to each pixel of the image; in general, as the number of taps in the image restoration filter increases, the restoration precision improves. ¶[0073] discloses that the number of taps (or cells) of the image restoration filter can be determined according to the aberrational characteristic of the image pickup optical system and the required restoration accuracy; each tap of the image restoration filter corresponds to each pixel of the input image used for the image restoration processing. ¶[0122] discloses that the phase correction processing uses the image restoration filter, and each tap (cell) of the image restoration filter needs to correspond to each pixel of the image.)
As per claim 13, Hatakeyama as modified by Findlay discloses the image processing apparatus according to claim 1, “wherein the processor is configured to acquire the number of taps of a filter kernel in the filter processing, and does not perform the filter processing in a case where it is determined that the number of taps is insufficient.” (Examiner notes the claim has been rejected under 35 U.S.C. 112 as stated above and is interpreted as best understood. Hatakeyama, ¶[0018] discloses that as the number of taps in the image restoration filter increases, the restoration precision improves. ¶[0073] discloses that the number of taps (or cells) of the image restoration filter can be determined according to the aberrational characteristic of the image pickup optical system and the required restoration accuracy. ¶[0122] discloses that the phase correction processing uses the image restoration filter and each tap (cell) of the image restoration filter needs to correspond to each pixel of the image; the correction is required before the PSF is transformed by the geometric transformation in the distortion correction. When the image restoration filter is applied to the image in which the distortion is corrected, it is necessary to convert the image restoration filter according to the coordinate transformation in the distortion correction.)
Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Hatakeyama (US 2011/0135216) in view of Findlay et al. (US 2010/0008597), further in view of Mathieu (US 2010/0328517).
As per claim 4, Hatakeyama as modified by Findlay discloses the image processing apparatus according to claim 1. However, Hatakeyama as modified by Findlay does not explicitly disclose the following, which would have been obvious in view of Mathieu from a similar field of endeavor: “wherein the parameter is a parameter for sharpening processing or blurring processing that makes closer to each other a spatial frequency characteristic of the first blurred image and a spatial frequency characteristic of the second blurred image.” (Mathieu, ¶[0068] discloses a filtering process that seeks to sharpen a signal; an optimized gain function (similar to Wiener's filter) is applied to reduce noise amplification during the contrast-enhancement process. ¶[0070] discloses "raw" MTFs as measured at different defocus distances ΔF of 10 mm from best focus between extremes of -50 mm and +50 mm of defocus. ¶[0071] discloses that the above-mentioned MTF gain function used to restore or enhance the raw MTF is a three-dimensional function G(u, v, d), wherein u is the spatial frequency along the X axis, v is the spatial frequency along the Y axis, and d is the distance of the object in the allowed extended depth of field DOF (d thus corresponds to the object distance D.sub.OB). ¶[0074] discloses that the after-digital process is preferably optimized to deliver substantially the same MTF at any distance. ¶[0079] discloses that the applied gain of the digital filter is optimized or enhanced to obtain the maximum output MTF' while controlling the gain or noise.)
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Mathieu's extended depth of field (EDOF) imaging technique with the image processing technique of Hatakeyama as modified by Findlay to provide the known and expected uses and benefits of Mathieu's technique. The proposed combination would have constituted a mere arrangement of old elements, each performing its known function, the combination yielding no more than one would expect from such an arrangement.
Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Mathieu into Hatakeyama as modified by Findlay in order to improve image quality. (Refer to Mathieu paragraph [0007].)
Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Hatakeyama (US 2011/0135216) in view of Findlay et al. (US 2010/0008597), further in view of Morgan-Mar et al. (US 2014/0152886).
As per claim 5, Hatakeyama as modified by Findlay discloses the image processing apparatus according to claim 1. However, Hatakeyama as modified by Findlay does not explicitly disclose the following, which would have been obvious in view of Morgan-Mar from a similar field of endeavor: “wherein the parameter is a parameter based on a combined function of an optical transfer function of the first blurred image and an inverse function of an optical transfer function of the second blurred image.” (Morgan-Mar, ¶[0067] discloses taking the ratio of the Fourier transforms of corresponding patches in the two images; see the equation. ¶[0069] discloses that this assignment allows an interpretation in which it is possible to consider f1 as a more blurred version of f2, related by a relative optical transfer function OTFr given by the spectral ratio; see the equation. ¶[0071] discloses that the space-varying relative point spread function PSFr is the inverse Fourier transform of (OTF1/OTF2).)
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Morgan-Mar's technique of modifying blur in images with the image processing technique of Hatakeyama as modified by Findlay to provide the known and expected uses and benefits of Morgan-Mar's technique. The proposed combination would have constituted a mere arrangement of old elements, each performing its known function, the combination yielding no more than one would expect from such an arrangement.
Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Morgan-Mar into Hatakeyama as modified by Findlay in order to provide realistic images with higher accuracy. (Refer to Morgan-Mar paragraph [0018].)
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Hatakeyama (US 2011/0135216) in view of Findlay et al. (US 2010/0008597), further in view of Garg et al. (US 2022/0375042).
As per claim 6, Hatakeyama as modified by Findlay discloses the image processing apparatus according to claim 1. However, Hatakeyama as modified by Findlay does not explicitly disclose the following, which would have been obvious in view of Garg from a similar field of endeavor: “wherein the parameter is a parameter for processing that makes closer to each other blurred sizes or blurred shapes of the first blurred image and the second blurred image, which have the same degree of the defocus.” (Garg, ¶[0003]-[0004] disclose dual-pixel image data that includes a first sub-image and a second sub-image, and determining a loss value using a loss function that includes one or more of: an equivalence loss term configured to determine a difference between (i) a convolution of the first sub-image with the second blur kernel and (ii) a convolution of the second sub-image with the first blur kernel. ¶[0021] discloses that the dual-pixel image sensor may be configured to generate dual-pixel image data that includes a first sub-image and a second sub-image. ¶[0022] discloses that when light reflected from a portion of a scene is out of focus with the corresponding portion of the dual-pixel image sensor, each photosite of the corresponding dual pixel may generate a different signal; thus, dual-pixel image data generated by the dual-pixel image sensor may contain information indicative of an extent of defocus associated with each dual pixel, and may thus be used to adjust the extent of apparent blurring associated with the dual-pixel image data. ¶[0025] discloses that the equivalence term may be configured to determine a difference between (i) a convolution of the first sub-image with the second blur kernel and (ii) a convolution of the second sub-image with the first blur kernel; the equivalence loss term may thus incentivize the optimization to generate blur kernels that increase and/or maximize an extent of symmetry between convolutions of the blur kernels and the dual-pixel sub-images. See also ¶[0026].)
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Garg's defocus blur removal technique with the image processing technique of Hatakeyama as modified by Findlay to provide the known and expected uses and benefits of Garg's technique. The proposed combination would have constituted a mere arrangement of old elements, each performing its known function, the combination yielding no more than one would expect from such an arrangement.
Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Garg into Hatakeyama as modified by Findlay in order to accurately adjust and correct blurred images. (Refer to Garg paragraph [0001].)
Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Hatakeyama (US 2011/0135216) in view of Findlay et al. (US 2010/0008597), further in view of Naito et al. (US 2021/0067664).
As per claim 9, Hatakeyama as modified by Findlay discloses the image processing apparatus according to claim 1. However, Hatakeyama as modified by Findlay does not explicitly disclose the following, which would have been obvious in view of Naito from a similar field of endeavor: “wherein the distance information is information about a defocus amount detected by an imaging-surface phase-difference detecting method using the image sensor.” (Naito, ¶[0037] discloses that the image sensor 300 is a CMOS sensor, for example, and has image pickup pixels that pick up an object and phase difference detection pixels that detect a phase difference of object images for auto-focusing of an imaging surface phase difference method. The CPU 21 calculates a defocus amount on the basis of the phase difference (distance between pupil-divided images) of object images in segmented regions that is obtained from the pixel signals output from the phase difference detection pixels.)
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Naito's image sensor technique with the image processing technique of Hatakeyama as modified by Findlay to provide the known and expected uses and benefits of Naito's technique. The proposed combination would have constituted a mere arrangement of old elements, each performing its known function, the combination yielding no more than one would expect from such an arrangement.
Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Naito into Hatakeyama as modified by Findlay in order to provide better image analysis ability to the system. (Refer to Naito paragraph [0003].)
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Hatakeyama (US 2011/0135216) in view of Findlay et al. (US 2010/0008597), further in view of Miyazawa et al. (US 2019/0124266).
As per claim 10, Hatakeyama as modified by Findlay discloses the image processing apparatus according to claim 1. However, Hatakeyama as modified by Findlay does not explicitly disclose the following, which would have been obvious in view of Miyazawa from a similar field of endeavor: “wherein the distance information is information about a defocus amount detected by a phase-difference detecting sensor.” (Miyazawa, ¶[0047] discloses that the distance acquisition unit 181 acquires the distance information using the AF sensor for phase difference detection.)
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Miyazawa's image processing control technique with the image processing technique of Hatakeyama as modified by Findlay to provide the known and expected uses and benefits of Miyazawa's technique. The proposed combination would have constituted a mere arrangement of old elements, each performing its known function, the combination yielding no more than one would expect from such an arrangement.
Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Miyazawa into Hatakeyama as modified by Findlay in order to provide effective correction of imaging. (Refer to Miyazawa paragraph [0005].)
Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Hatakeyama (US 2011/0135216) in view of Findlay et al. (US 2010/0008597), further in view of Fujiwara et al. (US 2021/0364793).
As per claim 11, Hatakeyama as modified by Findlay discloses the image processing apparatus according to claim 1. However, Hatakeyama as modified by Findlay does not explicitly disclose the following, which would have been obvious in view of Fujiwara from a similar field of endeavor: “wherein the distance information is information about an object distance detected by a distance sensor.” (Fujiwara, ¶[0052] discloses that the distance sensor measures the distance to the object using, for example, a spatial recognition technique such as Depth From Defocus technology.)
Before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to combine Fujiwara's distance sensor technique with the image processing technique of Hatakeyama as modified by Findlay to provide the known and expected uses and benefits of Fujiwara's technique. The proposed combination would have constituted a mere arrangement of old elements, each performing its known function, the combination yielding no more than one would expect from such an arrangement.
Therefore, it would have been obvious to a person of ordinary skill in the art to incorporate Fujiwara into Hatakeyama as modified by Findlay in order to enhance object detection in images. (Refer to Fujiwara paragraph [0003].)
Contact
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SHAGHAYEGH AZIMA whose telephone number is (571)272-1459. The examiner can normally be reached Monday-Friday, 9:30-6:30.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Vincent Rudolph, can be reached at (571)272-8243. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SHAGHAYEGH AZIMA/ Examiner, Art Unit 2671