Prosecution Insights
Last updated: April 19, 2026
Application No. 18/439,186

IMAGING DEVICE, METHOD OF DRIVING IMAGING DEVICE, AND PROGRAM

Status: Final Rejection — §103
Filed: Feb 12, 2024
Examiner: CHIU, WESLEY JASON
Art Unit: 2639
Tech Center: 2600 — Communications
Assignee: Fujifilm Corporation
OA Round: 2 (Final)

Grant Probability: 61% (Moderate)
Expected OA Rounds: 3-4
Time to Grant: 2y 6m
With Interview: 90%

Examiner Intelligence

Career Allow Rate: 61% (288 granted / 469 resolved; -0.6% vs TC avg)
Interview Lift: +28.2% among resolved cases with an interview
Typical Timeline: 2y 6m average prosecution
Currently Pending: 32
Career History: 501 total applications across all art units

Statute-Specific Performance

§101: 2.1% (-37.9% vs TC avg)
§103: 53.3% (+13.3% vs TC avg)
§102: 21.0% (-19.0% vs TC avg)
§112: 21.4% (-18.6% vs TC avg)

Tech Center averages are estimates. Based on career data from 469 resolved cases.

Office Action (§103)

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Receipt is acknowledged of papers submitted under 35 U.S.C. 119(a)-(d), which papers have been placed of record in the file.

Claim Amendments

Receipt is acknowledged of the amendments to the claims, which were received by the Office on 12/16/2025.

Response to Arguments

Applicant's arguments with respect to claims 1-14 and 16-18 have been considered but are moot because the arguments do not apply to the same combination of references being used in the current rejection. Applicant's arguments are directed solely to the claimed invention as amended 12/16/2025, which has been rejected under a new ground of rejection necessitated by amendment. See the rejection below for full detail.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-2, 6-8, 13-14 and 17-18 are rejected under 35 U.S.C. 103 as being unpatentable over Ihara et al. (US 2022/0028040 A1) in view of Onuki et al. (US 2012/0300104 A1).

Regarding claim 1, Ihara et al. (hereafter referred to as Ihara) teaches an imaging device (Ihara, Fig. 36) comprising: an image sensor that has a plurality of first phase difference pixels (Ihara, Fig. 1, pixels Za, Paragraph 0075) and a plurality of second phase difference pixels (Ihara, Fig. 1, pixels Zb, Paragraph 0075), and outputs a captured image (Ihara, Fig. 36, light reception section 461, Paragraph 0357); and at least one processor (Ihara, Fig. 36, ADC 462, encoding section 101 and signal processing section 453, Paragraphs 0106, 0348, 0351 and 0358), wherein the at least one processor is configured to: convert signals obtained from the plurality of first phase difference pixels and the plurality of second phase difference pixels into first phase difference information and second phase difference information by performing processing that includes comparing a value of a pixel-of-interest with values of peripheral pixels adjacent to the pixel-of-interest (Ihara, Figs. 3-4, Paragraphs 0100 and 0103-0106: phase difference pixel signals are converted to digital signals and encoded; encoding includes comparing adjacent pixel values to determine DPCM residuals); and acquire distance distribution information corresponding to the captured image based on the first phase difference information and the second phase difference information (Ihara, Paragraphs 0365-0366).
However, Ihara does not teach acquiring distance distribution information corresponding to the captured image by performing a shift operation on the first phase difference information and the second phase difference information; and acquiring subject distance information indicating a distance to a subject existing in a focusing target region and peripheral distance information indicating a distance to an object existing in a peripheral region of the focusing target region based on the distance distribution information.

In reference to Onuki et al. (hereafter referred to as Onuki), Onuki teaches at least one processor configured to: convert signals obtained from the plurality of first phase difference pixels and the plurality of second phase difference pixels into first phase difference information and second phase difference information (Onuki, Paragraph 0060, "A/D-converts an acquired image signal"; Fig. 14, Paragraphs 0119-0120); acquire distance distribution information corresponding to the captured image by performing a shift operation (Onuki, Fig. 14, Paragraphs 0121-0122: generating and using the shift amounts is interpreted as the shift operation) on the first phase difference information and the second phase difference information (Onuki, Fig. 17, Paragraphs 0134-0137); and acquire subject distance information indicating a distance to a subject existing in a focusing target region and peripheral distance information indicating a distance to an object existing in a peripheral region of the focusing target region based on the distance distribution information (Onuki, Fig. 17, Paragraphs 0134-0137: defocus amounts indicate distance). These arts are analogous since they are both related to imaging devices using phase difference pixels.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to modify the invention of Ihara with the method of creating a defocus map as seen in Onuki to allow the device to determine the distance to multiple subjects in the image for focus control. Claims 13 and 14 are rejected for the same reasons as claim 1.

Regarding claim 2, the combination of Ihara and Onuki teaches the imaging device according to claim 1 (see claim 1 analysis), further comprising: a focus lens (Ihara, Fig. 36, lens 451, Paragraph 0351; Onuki, Fig. 1, lens group 105, Paragraph 0056), wherein the at least one processor is configured to perform focusing control of controlling a position of the focus lens based on the subject distance information (Ihara, Paragraph 0351; Onuki, Paragraph 0061).

Regarding claim 6, the combination of Ihara and Onuki teaches the imaging device according to claim 1 (see claim 1 analysis), wherein the at least one processor is configured to: record the captured image and the distance distribution information (Ihara, Paragraph 0353; Onuki, Figs. 17B and 20, Steps S166-S167, Paragraphs 0134 and 0150-0151); and acquire the subject distance information and the peripheral distance information based on the distance distribution information (Onuki, Fig. 17, Paragraphs 0134-0137).

Regarding claim 7, the combination of Ihara and Onuki teaches the imaging device according to claim 6 (see claim 6 analysis), wherein the at least one processor is configured to generate and record an image file including the captured image and the distance distribution information (Ihara, Paragraph 0353; Onuki, Fig. 20, Steps S166-S167, Paragraphs 0150-0151).
Regarding claim 8, the combination of Ihara and Onuki teaches the imaging device according to claim 6 (see claim 6 analysis), wherein the peripheral distance information included in the distance distribution information includes a relative distance of an object in the peripheral region with respect to the focusing target region (Onuki, Fig. 17B, Paragraphs 0134-0137: defocus amounts indicate a relative distance).

Regarding claim 17, the combination of Ihara and Onuki teaches the imaging device according to claim 1 (see claim 1 analysis), wherein the at least one processor is configured to convert signals obtained from the plurality of first phase difference pixels and the plurality of second phase difference pixels into first phase difference information and second phase difference information by performing local binary encoding processing that includes comparing a value of a pixel-of-interest with values of peripheral pixels adjacent to the pixel-of-interest (Ihara, Figs. 3-4, Paragraphs 0100 and 0103-0106: phase difference pixel signals are converted to digital signals (binary) and encoded; the binary encoding may be considered to be "local" since it is performed within the device or since adjacent pixels are compared).

Regarding claim 18, Ihara teaches an imaging device (Ihara, Fig. 36) comprising: an image sensor that has a plurality of first phase difference pixels (Ihara, Fig. 1, pixels Za, Paragraph 0075) and a plurality of second phase difference pixels (Ihara, Fig. 1, pixels Zb, Paragraph 0075), and outputs a captured image (Ihara, Fig. 36, light reception section 461, Paragraph 0357); and at least one processor (Ihara, Fig. 36, ADC 462, encoding section 101 and signal processing section 453, Paragraphs 0106, 0348, 0351 and 0358), wherein the at least one processor is configured to: convert signals obtained from the plurality of first phase difference pixels and the plurality of second phase difference pixels into first phase difference information and second phase difference information by using a local binary encoding method, wherein the local binary encoding method includes comparing a value of a pixel-of-interest with values of peripheral pixels adjacent to the pixel-of-interest (Ihara, Figs. 3-4, Paragraphs 0100 and 0103-0106: phase difference pixel signals are converted to digital signals (binary) and encoded; encoding includes comparing adjacent pixel values to determine DPCM residuals; the binary encoding may be considered to be "local" since it is performed within the device or since adjacent pixels are compared); and acquire distance distribution information corresponding to the captured image based on the first phase difference information and the second phase difference information (Ihara, Paragraphs 0365-0366).

However, Ihara does not teach acquiring distance distribution information corresponding to the captured image by performing a shift operation on the first phase difference information and the second phase difference information; and acquiring subject distance information indicating a distance to a subject existing in a focusing target region and peripheral distance information indicating a distance to an object existing in a peripheral region of the focusing target region based on the distance distribution information.

In reference to Onuki et al. (hereafter referred to as Onuki), Onuki teaches at least one processor configured to: convert signals obtained from the plurality of first phase difference pixels and the plurality of second phase difference pixels into first phase difference information and second phase difference information (Onuki, Paragraph 0060, "A/D-converts an acquired image signal"; Fig. 14, Paragraphs 0119-0120); acquire distance distribution information corresponding to the captured image by performing a shift operation (Onuki, Fig. 14, Paragraphs 0121-0122: generating and using the shift amounts is interpreted as the shift operation) on the first phase difference information and the second phase difference information (Onuki, Fig. 17, Paragraphs 0134-0137); and acquire subject distance information indicating a distance to a subject existing in a focusing target region and peripheral distance information indicating a distance to an object existing in a peripheral region of the focusing target region based on the distance distribution information (Onuki, Fig. 17, Paragraphs 0134-0137: defocus amounts indicate distance). These arts are analogous since they are both related to imaging devices using phase difference pixels. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to modify the invention of Ihara with the method of creating a defocus map as seen in Onuki to allow the device to determine the distance to multiple subjects in the image for focus control.

Claims 3-4 are rejected under 35 U.S.C. 103 as being unpatentable over Ihara et al. (US 2022/0028040 A1) in view of Onuki et al. (US 2012/0300104 A1) in view of Yoneyama et al. (US 2013/0162839 A1).

Regarding claim 3, the combination of Ihara and Onuki teaches the imaging device according to claim 2 (see claim 2 analysis).
However, the combination of Ihara and Onuki does not teach wherein the at least one processor is configured to: detect an object existing between the subject and the imaging device based on the subject distance information and the peripheral distance information; and in a case where a distance within an angle of view of the object with respect to the subject is reduced, change the focusing control.

In reference to Yoneyama et al. (hereafter referred to as Yoneyama), Yoneyama teaches detecting an object (Yoneyama, Fig. 12, subject K) existing between a subject (Yoneyama, Fig. 12, subject H) and the imaging device based on the subject distance information and the peripheral distance information (Yoneyama, Fig. 12, Paragraphs 0132-0133; Fig. 13, Paragraphs 0140-0141: phase difference may be used for determining distance in place of contrast information); and in a case where a distance within an angle of view of the object with respect to the subject is reduced, changing the focusing control (Yoneyama, Fig. 5, Steps S110-S112, Paragraphs 0067-0068; Fig. 8, Steps S402-S410, Paragraphs 0106-0110: tracking processing is used to determine the focus subject, and changes to the tracking processing are seen to be changes to focusing control). These arts are analogous since they are all related to imaging devices performing focusing. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to modify the combination of Ihara and Onuki with the method of changing focus control as seen in Yoneyama to allow the device to continue focusing on a subject after determining the subject is obscured.
Regarding claim 4, the combination of Ihara, Onuki and Yoneyama teaches the imaging device according to claim 3 (see claim 3 analysis), wherein the at least one processor is configured to estimate a position of the subject based on a past position of the subject in a case where the object blocks the subject (Yoneyama, Fig. 7, Steps S306-S310, Paragraphs 0101-0103; Fig. 8, Steps S414-S418, Paragraphs 0113-0114; Fig. 11, distance information calculation areas F1-F3, Paragraphs 0128-0129: the distance information calculation areas F1-F3 are seen to be an estimate of a position of the subject based on a past position of the subject).

Alternatively, claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Ihara et al. (US 2022/0028040 A1) in view of Onuki et al. (US 2012/0300104 A1) in view of Yoneyama et al. (US 2013/0162839 A1) in view of Muramatsu (US 2011/0234885 A1).

Alternatively, regarding claim 4, the combination of Ihara, Onuki and Yoneyama teaches the imaging device according to claim 3 (see claim 3 analysis). However, the combination of Ihara, Onuki and Yoneyama does not teach wherein the at least one processor is configured to estimate a position of the subject based on a past position of the subject in a case where the object blocks the subject. In reference to Muramatsu, Muramatsu teaches estimating a position of the subject based on a past position of the subject in a case where the object blocks the subject (Muramatsu, Figs. 11B and 11C, Paragraph 0064). These arts are analogous since they are all related to imaging devices performing focusing. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to modify the combination of Ihara, Onuki and Yoneyama with the method of moving a search area based on an estimated position of the subject as seen in Muramatsu to increase the likelihood of recapturing the tracking target subject (Muramatsu, Paragraphs 0063-0064).
Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Ihara et al. (US 2022/0028040 A1) in view of Onuki et al. (US 2012/0300104 A1) in view of Yoneyama et al. (US 2013/0162839 A1) in view of Imamiya (US 2021/0120170 A1).

Regarding claim 5, the combination of Ihara, Onuki and Yoneyama teaches the imaging device according to claim 4 (see claim 4 analysis), wherein the at least one processor is configured to move the focusing target region to the estimated position of the subject (Yoneyama, Fig. 11, frame positions E1-E3, Paragraphs 0128-0129: the frame positions are the focusing target region). However, the combination of Ihara, Onuki and Yoneyama does not teach, in a case where the subject is not detected from the focusing target region after the movement, moving the focusing target region to a position of the object.

In reference to Imamiya, Imamiya teaches moving the focusing target region to the estimated position of the subject (Imamiya, Figs. 5-7, Paragraphs 0068-0069; Fig. 9, Step S910, Paragraphs 0097 and 0099); and, in a case where the subject is not detected from the focusing target region after the movement, moving the focusing target region to a new subject in close range (Imamiya, Fig. 9, Steps S910-S913, Paragraphs 0093 and 0096-0102). These arts are analogous since they are all related to imaging devices performing focusing. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to modify the combination of Ihara, Onuki and Yoneyama with the method of changing the focusing target to a new subject in close range as seen in Imamiya to allow the device to focus on a different target if a previous target is lost. Further, the limitation "in a case where the subject is not detected from the focusing target region after the movement, move the focusing target region to a position of the object" is met since the position of the object would be a close-range subject.
Claims 9-11 are rejected under 35 U.S.C. 103 as being unpatentable over Ihara et al. (US 2022/0028040 A1) in view of Onuki et al. (US 2012/0300104 A1) in view of Sasaki (US 2013/0308018 A1).

Regarding claim 9, the combination of Ihara and Onuki teaches the imaging device according to claim 8 (see claim 8 analysis). However, the combination of Ihara and Onuki does not teach wherein the at least one processor is configured to perform correction processing on at least one of the focusing target region or the peripheral region of the captured image based on the distance distribution information. In reference to Sasaki, Sasaki teaches performing correction processing on at least one of the focusing target region or the peripheral region of the captured image based on the distance distribution information (Sasaki, Fig. 6, Paragraphs 0047-0050). These arts are analogous since they are all related to imaging devices performing focusing. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to modify the combination of Ihara and Onuki with the correction processing as seen in Sasaki to correct for chromatic aberrations in the image based on the distances of the objects.

Regarding claim 10, the combination of Ihara, Onuki and Sasaki teaches the imaging device according to claim 9 (see claim 9 analysis), wherein the at least one processor is configured to change the correction processing on the object in accordance with the relative distance (Sasaki, Fig. 6, Paragraph 0048; Onuki, Fig. 17B, Paragraphs 0134-0137: defocus amounts indicate a relative distance).

Regarding claim 11, the combination of Ihara, Onuki and Sasaki teaches the imaging device according to claim 10 (see claim 10 analysis), wherein the correction processing on the object is chromatic aberration correction (Sasaki, Fig. 6, Paragraphs 0047-0050).

Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Ihara et al. (US 2022/0028040 A1) in view of Onuki et al. (US 2012/0300104 A1) in view of Rosmeulen (US 2020/0402296 A1).

Regarding claim 12, the combination of Ihara and Onuki teaches the imaging device according to claim 6 (see claim 6 analysis), wherein the distance distribution information includes distance information corresponding to a plurality of pixels constituting the captured image (Onuki, Fig. 17, Paragraphs 0134-0137). However, the combination of Ihara and Onuki does not teach the at least one processor being configured to composite a stereoscopic image with the captured image by using the distance information to generate a composite image. In reference to Rosmeulen, Rosmeulen teaches compositing a stereoscopic image (Rosmeulen, Fig. 1, second image 13) with the captured image (Rosmeulen, Fig. 1, first image 12) by using the distance information to generate a composite image (Rosmeulen, Fig. 1, AR image 11, Paragraphs 0060 and 0063-0064). These arts are analogous since they are all related to imaging devices. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to modify the combination of Ihara and Onuki with the method of compositing a stereoscopic image with the captured image as seen in Rosmeulen to allow the device to generate AR images and provide greater functionality to the device.

Claim 16 is rejected under 35 U.S.C. 103 as being unpatentable over Ihara et al. (US 2022/0028040 A1) in view of Onuki et al. (US 2012/0300104 A1) in view of Takayanagi et al. (US 2010/0157127 A1).
Regarding claim 16, the combination of Ihara and Onuki teaches the imaging device according to claim 1 (see claim 1 analysis), wherein the at least one processor is configured to distinguish the subject existing in the focusing target region from the object existing in the peripheral region based on the subject distance information and the peripheral distance information (Onuki, Fig. 17, Paragraphs 0134-0137). However, the combination of Ihara and Onuki does not teach performing correction to reduce the luminance of the object. In reference to Takayanagi et al. (hereafter referred to as Takayanagi), Takayanagi teaches performing correction to reduce the luminance of objects in a peripheral region (Takayanagi, Fig. 3C, Paragraph 0059). These arts are analogous since they are all related to imaging devices. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to modify the combination of Ihara and Onuki with the method of reducing the luminance of objects in a peripheral region as seen in Takayanagi to emphasize the in-focus center subject (Takayanagi, Paragraphs 0056 and 0059).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action.
In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to WESLEY JASON CHIU whose telephone number is (571) 270-1312. The examiner can normally be reached Mon-Fri: 8am-4pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Twyler Haskins, can be reached at (571) 272-7406. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/WESLEY J CHIU/ Examiner, Art Unit 2639
/TWYLER L HASKINS/ Supervisory Patent Examiner, Art Unit 2639
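The "local binary encoding" recited in claims 17-18 (comparing a pixel-of-interest against each adjacent peripheral pixel) closely resembles the classic local binary pattern (LBP) operator. The sketch below illustrates only that neighbor-comparison idea; it is not drawn from the application or from the cited references, whose actual encoding (e.g., Ihara's DPCM residuals) differs in detail.

```python
def lbp_code(patch):
    """Local binary pattern for a 3x3 patch (9 values, row-major):
    each of the 8 peripheral pixels is compared against the center
    pixel-of-interest, and the results are packed into an 8-bit code."""
    center = patch[4]
    # Peripheral indices in clockwise order starting at the top-left.
    neighbors = [0, 1, 2, 5, 8, 7, 6, 3]
    code = 0
    for bit, idx in enumerate(neighbors):
        if patch[idx] >= center:
            code |= 1 << bit
    return code

# A patch whose right column is brighter than the center pixel.
patch = [10, 10, 90,
         10, 50, 90,
         10, 10, 90]
print(lbp_code(patch))  # -> 28 (bits set for the three bright neighbors)
```

Because the code depends only on comparisons, it is robust to uniform gain changes, which is one reason such encodings are attractive for on-sensor phase-difference data.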

Prosecution Timeline

Feb 12, 2024 — Application Filed
Jul 30, 2025 — Examiner Interview (Telephonic)
Sep 10, 2025 — Non-Final Rejection (§103)
Dec 16, 2025 — Response Filed
Jan 15, 2026 — Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12593139 — IMAGE SIGNAL PROCESSOR AND METHOD FOR PROCESSING IMAGE SIGNAL
Granted Mar 31, 2026 (2y 5m to grant)
Patent 12581211 — IMAGING CIRCUIT AND IMAGING DEVICE
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12581179 — CAMERA MODULE AND VEHICLE COMPRISING SAME
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12568319 — Image device capable of switching between global shutter mode and dynamic vision sensor mode
Granted Mar 03, 2026 (2y 5m to grant)
Patent 12563313 — IMAGE SENSING DEVICE
Granted Feb 24, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 61%
With Interview: 90% (+28.2%)
Median Time to Grant: 2y 6m
PTA Risk: Moderate

Based on 469 resolved cases by this examiner. Grant probability derived from career allow rate.
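The headline figures are consistent with a simple derivation from the career data shown above (288 granted of 469 resolved, plus the observed +28.2-point interview lift). A minimal sketch of that arithmetic, assuming the tool uses this simple additive model:

```python
# Career figures from the examiner data above; the additive interview
# adjustment is an assumption about how the dashboard combines them.
granted, resolved = 288, 469
interview_lift = 0.282  # +28.2 percentage points with an interview

allow_rate = granted / resolved                 # ~0.614
base_pct = round(allow_rate * 100)              # -> 61 (%)
with_interview_pct = round((allow_rate + interview_lift) * 100)  # -> 90 (%)

print(base_pct, with_interview_pct)
```

Both rounded values match the dashboard's 61% and 90%, so no more elaborate model is needed to reproduce them.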
