Prosecution Insights
Last updated: April 19, 2026
Application No. 18/632,210

FOCUS CONTROL DEVICE, IMAGING APPARATUS, FOCUS CONTROL METHOD, AND PROGRAM

Status: Non-Final OA, §102
Filed: Apr 10, 2024
Examiner: CALDERON, CYNTHIA
Art Unit: 2639
Tech Center: 2600 (Communications)
Assignee: Fujifilm Corporation
OA Round: 1 (Non-Final)

Grant Probability: 77% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 4m
With Interview: 96%

Examiner Intelligence

Career Allow Rate: 77% (above average; 602 granted / 782 resolved; +15.0% vs TC avg)
Interview Lift: +18.5% (strong; resolved cases with interview vs. without)
Avg Prosecution: 2y 4m (typical timeline; 17 currently pending)
Total Applications: 799 (career history, across all art units)

Statute-Specific Performance

§101: 4.9% (-35.1% vs TC avg)
§103: 42.1% (+2.1% vs TC avg)
§102: 30.7% (-9.3% vs TC avg)
§112: 11.9% (-28.1% vs TC avg)
Tech Center averages are estimates. Based on career data from 782 resolved cases.
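The "vs TC avg" deltas above can be checked by backing out the Tech Center baseline each one implies (baseline = examiner rate minus delta). A minimal sketch, with the values transcribed from the panel (variable names are illustrative, not from the dashboard):

```python
# Examiner's per-statute rejection rates (%) and reported deltas vs the
# Tech Center average, as shown in the Statute-Specific Performance panel.
examiner_rates = {"101": 4.9, "103": 42.1, "102": 30.7, "112": 11.9}
delta_vs_tc = {"101": -35.1, "103": 2.1, "102": -9.3, "112": -28.1}

# Back out the TC baseline each delta was measured against.
tc_avg = {s: round(examiner_rates[s] - delta_vs_tc[s], 1) for s in examiner_rates}
for statute in examiner_rates:
    print(f"§{statute}: examiner {examiner_rates[statute]}% vs TC avg {tc_avg[statute]}%")
```

Every statute backs out to the same 40.0% baseline, which suggests the dashboard measures all four deltas against a single estimated Tech Center figure rather than per-statute averages.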

Office Action

§102
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

2. Receipt is acknowledged of certified copies of documents required by 37 CFR 1.55.

Information Disclosure Statement

3. The information disclosure statement (IDS) submitted on 04/10/2024 is in compliance with the provisions of 37 CFR 1.97 and was considered by the examiner.

Claim Rejections - 35 USC § 102

4. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

5. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

6. Claims 1-17 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Nishikawa et al. (US-PGPUB 2022/0353423).

Regarding claim 1, Nishikawa discloses a focus control device (Camera system 10; see fig. 1 and paragraph 0030) comprising: a processor (Camera MPU 125; see fig. 1 and paragraph 0036); and a memory (ROM 125a and memory 118; see fig. 1 and paragraphs 0041, 0036), wherein the processor is configured to: acquire an image signal output from an imaging element (The image sensor 122 photoelectrically converts the subject image formed via the imaging optical system, and outputs an imaging signal and a focus detection signal as the image data; see paragraph 0037. The phase difference AF unit 129 performs focus detection processing of a phase difference detection method based on image signals (signals for phase difference AF) of image data for focus detection that are obtained from the image sensor 122 and the image processing circuit 124; see paragraph 0043); set a focusing target region in an imaging region based on output information from an operating device that receives an operation by a user (Setting AF frame 802/1102 (arbitrary region) selected by the user on an image capturing screen 801/1101; see paragraphs 0069, 0096 and figs. 7, 8A, 11A); determine a search region based on the focusing target region (A calculation region 803 is set around an arbitrary AF frame 802; see figs. 7, 8A, 8B and paragraph 0069. Fig. 13A illustrates the set calculation region; see paragraph 0099); detect an object region including a specific object from the search region (Detecting a subject from a subject class region 804; see figs. 7, 8B and paragraph 0071. Or, setting a subject recognition frame 1103 in the calculation region, represented as a double square frame which indicates an index of a subject recognition region, subject 1104; see figs. 7, 11B, 13A and paragraphs 0096, 0099); detect an overlapping region in which the focusing target region and the object region overlap each other (Because the subject recognition frame 1103 and the AF frame 1102 overlap, the overlap region 1106 exists; see figs. 11B, 13A-13B and paragraphs 0096-0098); and perform focus control based on the image signal of the overlapping region (In a case where it is determined that the AF frame and the subject recognition frame overlap, then in step S1004, a main subject region is selected from a region in the subject recognition frame. The overlap determination performed in step S1002 of FIG. 10 is important to determine whether a region desired by the user to be in focus exists in the AF frame or in the subject recognition region and select the region; see paragraphs 0098-0099. After the region is selected, focus steps S709, S712-S714 are executed; see fig. 7 and paragraphs 0077, 0081-0083).

Regarding claim 2, Nishikawa discloses everything claimed as applied above (see claim 1). In addition, Nishikawa discloses the focusing target region includes a plurality of blocks (AF frame 802/1203 has a plurality of squares; see figs. 8A, 8B, 12B, 13A and paragraphs 0069, 0098-0099), and the processor is configured to detect one or a plurality of blocks that overlap the object region among the plurality of blocks, as the overlapping region (A square 1301 is a region corresponding to one calculation region; a horizontal length 1302 and a vertical length 1303 of a calculation region are each indicated by a curly bracket. Each curly bracket indicates a horizontal length 1304 of the overlap region of the region in the AF frame and the subject recognition region and a vertical length 1305 of the overlap region. In the overlap determination, the size of the overlap region where the region in the AF frame and the subject recognition region are determined to overlap can be defined by comparing the vertical and horizontal lengths of the overlap region and the vertical and horizontal lengths of the calculation region, and both of the vertical and horizontal lengths of the overlap region can be defined to be longer than those of the calculation region; see paragraph 0099 and figs. 13A-13B).

Regarding claim 3, Nishikawa discloses everything claimed as applied above (see claim 2). In addition, Nishikawa discloses the processor is configured to detect one or a plurality of blocks at which an overlap ratio with the object region is equal to or higher than a threshold value among the plurality of blocks, as the overlapping region (In a case where the horizontal length 1304 of the overlap region is longer than the horizontal length 1302 of the calculation region and the vertical length 1305 of the overlap region is longer than the vertical length 1303 of the calculation region, it can be determined that the region in the AF frame and the subject recognition region overlap; see figs. 13A-13B and paragraph 0099).

Regarding claim 4, Nishikawa discloses everything claimed as applied above (see claim 3). In addition, Nishikawa discloses the processor is configured to change the threshold value according to a type of the specific object (The portion determined to overlap can be changed to a subject recognition region with a higher priority order. Even if a size of an overlap region of a subject recognition region of a portion with a higher priority order and a region in an AF frame is smaller than the lower limit overlap size, the portion can be determined to overlap; see paragraph 0102).

Regarding claim 5, Nishikawa discloses everything claimed as applied above (see claim 1). In addition, Nishikawa discloses the processor is configured to detect a region in which an overlap ratio of the focusing target region and the object region is equal to or higher than a threshold value, as the overlapping region (In a case where the horizontal length 1304 of the overlap region is longer than the horizontal length 1302 of the calculation region and the vertical length 1305 of the overlap region is longer than the vertical length 1303 of the calculation region, it can be determined that the region in the AF frame and the subject recognition region overlap; see figs. 13A-13B and paragraph 0099).

Regarding claim 6, Nishikawa discloses everything claimed as applied above (see claim 5). In addition, Nishikawa discloses the processor is configured to, in a case where the overlap ratio is lower than the threshold value, perform focus control based on the image signal of the focusing target region (In a case where the lower limit overlap size is small, the subject recognition region can be used if the region in the AF frame and the subject recognition region overlap slightly; see paragraph 0100).

Regarding claim 7, Nishikawa discloses everything claimed as applied above (see claim 1). In addition, Nishikawa discloses the processor is configured to determine the search region based on a long side of the focusing target region (The calculation region 803 is set around an arbitrary AF frame 802 (arbitrary region) selected by the user on an image capturing screen 801 so that multipoint defocus calculation can be performed in a range wider than the AF frame 802; see paragraph 0069).

Regarding claim 8, Nishikawa discloses everything claimed as applied above (see claim 7). In addition, Nishikawa discloses the focusing target region has a rectangular shape (See AF frame shapes 802/1203 in figs. 8A-8B, 12B, 13A and paragraph 0002).

Regarding claim 9, Nishikawa discloses everything claimed as applied above (see claim 2). In addition, Nishikawa discloses the processor is configured to, in a case where the focusing target region does not include a plurality of blocks, divide the focusing target region into the number of blocks according to a type of the specific object (Select the main subject region preferentially using subject detection class; see step S708, figs. 7 and 9 and paragraphs 0075-0076).

Regarding claim 10, Nishikawa discloses everything claimed as applied above (see claim 1). In addition, Nishikawa discloses the processor is configured to acquire a defocus amount for non-focus control based on the image signal of a region that is outside the overlapping region and is inside the search region (A defocus amount is calculated at each point in the defocus amount calculation region 803; see figs. 8A, 8B, 13A and paragraphs 0070, 0099).

Regarding claim 11, Nishikawa discloses everything claimed as applied above (see claim 1). In addition, Nishikawa discloses the processor is configured to highlight and display the overlapping region on a display device by changing a color of a frame of the overlapping region, a shape of the frame, or a line type of the frame (An overlap region 1106 illustrated in FIG. 11B is represented as a shaded portion; see fig. 11B and paragraph 0096).

Regarding claim 12, Nishikawa discloses everything claimed as applied above (see claim 1). In addition, Nishikawa discloses the processor is configured to detect the object region by inputting the image signal of the search region into a machine-trained model (Specific subject recognition detection is performed. As the detection method, a learning technique that uses known machine learning is used. In the learning technique that uses machine learning, a feature amount of each portion (face, pupil, entire body) of a subject (human, animal, etc.) is preliminarily learned, a preliminarily-learned subject is recognized from a captured image, and the region (or position and size) is acquired; see paragraphs 0071-0072. The subject recognition frame 1103 is a subject recognition frame set when a pupil of the subject 1104 is detected; see paragraphs 0096, 0046).

Regarding claim 13, Nishikawa discloses everything claimed as applied above (see claim 1). In addition, Nishikawa discloses the processor is configured to, in a case where the search region determined based on the focusing target region is smaller than a defined size, set a size of the search region to the defined size (The main subject region is set to a calculation region with a defocus amount closest to the anticipated defocus amount among calculation regions in the subject detection class in the AF frame; see figs. 7, 9 and paragraph 0094).

Regarding claim 14, Nishikawa discloses everything claimed as applied above (see claim 1). In addition, Nishikawa discloses the processor is configured to, in a case where the detected object region is a specific portion, change a size of the search region according to a type or a size of the portion (The main subject region is set to a calculation region with a defocus amount closest to the anticipated defocus amount among calculation regions in the subject detection class; see paragraph 0093).

Regarding claim 15, Nishikawa discloses an imaging apparatus (Camera system 10; see fig. 1 and paragraph 0030) comprising: the focus control device according to claim 1 (see the rejection of claim 1 above); the imaging element (Image sensor 122; see fig. 1 and paragraph 0037); and the operating device (Display device 126 including the image capturing screens; see fig. 1 and paragraphs 0042, 0069, 0096).

Regarding claim 16, Nishikawa discloses a focus control method (see figs. 7, 10) performed by a processor (Camera MPU 125; see fig. 1 and paragraph 0036), the method comprising: acquiring an image signal output from an imaging element (The image sensor 122 photoelectrically converts the subject image formed via the imaging optical system, and outputs an imaging signal and a focus detection signal as the image data; see paragraph 0037. The phase difference AF unit 129 performs focus detection processing of a phase difference detection method based on image signals (signals for phase difference AF) of image data for focus detection that are obtained from the image sensor 122 and the image processing circuit 124; see paragraph 0043); setting a focusing target region in an imaging region based on output information from an operating device that receives an operation by a user (Setting AF frame 802/1102 (arbitrary region) selected by the user on an image capturing screen 801/1101; see paragraphs 0069, 0096 and figs. 7, 8A, 11A); determining a search region based on the focusing target region (A calculation region 803 is set around an arbitrary AF frame 802; see figs. 7, 8A, 8B and paragraph 0069. Fig. 13A illustrates the set calculation region; see paragraph 0099); detecting an object region including a specific object from the search region (Detecting a subject from a subject class region 804; see figs. 7, 8B and paragraph 0071. Or, setting a subject recognition frame 1103 in the calculation region, represented as a double square frame which indicates an index of a subject recognition region, subject 1104; see figs. 7, 11B, 13A and paragraphs 0096, 0099); detecting an overlapping region in which the focusing target region and the object region overlap each other (Because the subject recognition frame 1103 and the AF frame 1102 overlap, the overlap region 1106 exists; see figs. 11B, 13A-13B and paragraphs 0096-0098); and performing focus control based on the image signal of the overlapping region (In a case where it is determined that the AF frame and the subject recognition frame overlap, then in step S1004, a main subject region is selected from a region in the subject recognition frame. The overlap determination performed in step S1002 of FIG. 10 is important to determine whether a region desired by the user to be in focus exists in the AF frame or in the subject recognition region and select the region; see paragraphs 0098-0099. After the region is selected, focus steps S709, S712-S714 are executed; see fig. 7 and paragraphs 0077, 0081-0083).

Regarding claim 17, Nishikawa discloses a non-transitory computer-readable storage medium (ROM 125a and memory 118; see fig. 1 and paragraphs 0041, 0036) storing a program causing a processor (Camera MPU 125; see fig. 1 and paragraph 0036) to execute a process comprising: acquiring an image signal output from an imaging element (The image sensor 122 photoelectrically converts the subject image formed via the imaging optical system, and outputs an imaging signal and a focus detection signal as the image data; see paragraph 0037. The phase difference AF unit 129 performs focus detection processing of a phase difference detection method based on image signals (signals for phase difference AF) of image data for focus detection that are obtained from the image sensor 122 and the image processing circuit 124; see paragraph 0043); setting a focusing target region in an imaging region based on output information from an operating device that receives an operation by a user (Setting AF frame 802/1102 (arbitrary region) selected by the user on an image capturing screen 801/1101; see paragraphs 0069, 0096 and figs. 7, 8A, 11A); determining a search region based on the focusing target region (A calculation region 803 is set around an arbitrary AF frame 802; see figs. 7, 8A, 8B and paragraph 0069. Fig. 13A illustrates the set calculation region; see paragraph 0099); detecting an object region including a specific object from the search region (Detecting a subject from a subject class region 804; see figs. 7, 8B and paragraph 0071. Or, setting a subject recognition frame 1103 in the calculation region, represented as a double square frame which indicates an index of a subject recognition region, subject 1104; see figs. 7, 11B, 13A and paragraphs 0096, 0099); detecting an overlapping region in which the focusing target region and the object region overlap each other (Because the subject recognition frame 1103 and the AF frame 1102 overlap, the overlap region 1106 exists; see figs. 11B, 13A-13B and paragraphs 0096-0098); and performing focus control based on the image signal of the overlapping region (In a case where it is determined that the AF frame and the subject recognition frame overlap, then in step S1004, a main subject region is selected from a region in the subject recognition frame. The overlap determination performed in step S1002 of FIG. 10 is important to determine whether a region desired by the user to be in focus exists in the AF frame or in the subject recognition region and select the region; see paragraphs 0098-0099. After the region is selected, focus steps S709, S712-S714 are executed; see fig. 7 and paragraphs 0077, 0081-0083).

Citation of Pertinent Art

7. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Kuribayashi (US-PGPUB 2018/0299645) discloses that the focusing position determination unit selects an overlapped area of the setting areas for which the first evaluation unit evaluates that the evaluation value is equal to or greater than the evaluation threshold value and the setting areas for which the second evaluation unit evaluates that the evaluation value is equal to or greater than the evaluation threshold value, as the setting area to be used in the determination of the focusing position. In a case where the object H2 suddenly enters within an angle of view in a state in which the person H1 is captured, the focusing position is determined by selecting the overlapped area of the first evaluation areas and the second evaluation areas, and thus, it is possible to select the AF areas 53 in which the person H1 is formed.

Contact Information

8. Any inquiry concerning this communication or earlier communications from the examiner should be directed to CYNTHIA CALDERON whose telephone number is (571)270-3580. The examiner can normally be reached M-F 9:00 AM-5:00 PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, TWYLER HASKINS, can be reached at (571)272-7406. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/CYNTHIA CALDERON/
Primary Examiner, Art Unit 2639
03/03/2026
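The overlap determination the examiner cites from Nishikawa's paragraph 0099 (against claims 3 and 5) amounts to an axis-aligned rectangle-intersection test: the AF frame and the subject recognition region are treated as overlapping only when both the horizontal and vertical extents of their intersection exceed those of one calculation region. A minimal sketch of that test; the function and variable names are illustrative, not taken from the reference:

```python
def intersection(a, b):
    """Intersection of two axis-aligned rectangles given as (x, y, w, h);
    returns the intersection rectangle, or None if they are disjoint."""
    x = max(a[0], b[0])
    y = max(a[1], b[1])
    w = min(a[0] + a[2], b[0] + b[2]) - x
    h = min(a[1] + a[3], b[1] + b[3]) - y
    return (x, y, w, h) if w > 0 and h > 0 else None

def frames_overlap(af_frame, subject_region, calc_w, calc_h):
    """True when the overlap region is both wider and taller than one
    calculation region (the lower limit overlap size in the reference)."""
    inter = intersection(af_frame, subject_region)
    return inter is not None and inter[2] > calc_w and inter[3] > calc_h

# AF frame 100x100 at the origin; the subject region is shifted so the
# intersection is 60x60, then 40x40, against a 50x50 calculation region.
print(frames_overlap((0, 0, 100, 100), (40, 40, 100, 100), 50, 50))  # True
print(frames_overlap((0, 0, 100, 100), (60, 60, 100, 100), 50, 50))  # False
```

Under this reading, claim 4's per-object threshold would correspond to Nishikawa's paragraph 0102, where a higher-priority subject portion can be deemed overlapping even below the lower limit overlap size.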

Prosecution Timeline

Apr 10, 2024: Application Filed
Mar 03, 2026: Non-Final Rejection, §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12604088: IMAGE PICKUP APPARATUS CAPABLE OF CONTROLLING POWER SUPPLY, ITS CONTROL METHOD, AND STORAGE MEDIUM. Granted Apr 14, 2026 (2y 5m to grant).
Patent 12604108: LIGHTFIELD CAMERA THAT CAN SIMULTANEOUSLY ACQUIRE 2D INFORMATION AND 3D SPATIAL INFORMATION FROM SAME DEPTH. Granted Apr 14, 2026 (2y 5m to grant).
Patent 12598388: IMAGE CAPTURING APPARATUS AND CONTROL METHOD THEREOF. Granted Apr 07, 2026 (2y 5m to grant).
Patent 12593120: METHOD FOR ACQUIRING A PHOTOGRAPHIC PORTRAIT OF AN INDIVIDUAL AND APPARATUS IMPLEMENTING THIS METHOD. Granted Mar 31, 2026 (2y 5m to grant).
Patent 12587745: IMAGE CAPTURING APPARATUS AND CONTROL METHOD THEREOF. Granted Mar 24, 2026 (2y 5m to grant).
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 77%
With Interview: 96% (+18.5%)
Median Time to Grant: 2y 4m
PTA Risk: Low
Based on 782 resolved cases by this examiner. Grant probability is derived from the career allow rate.
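The headline figures follow from the career data shown earlier (602 granted of 782 resolved, +18.5% interview lift). A minimal sketch of the arithmetic; the combination rule (adding the reported lift to the rounded base rate) is an assumption, chosen because it reproduces the displayed numbers:

```python
# Career data from the Examiner Intelligence panel.
granted, resolved = 602, 782
interview_lift = 18.5  # percentage points, as reported

base_rate = round(100 * granted / resolved)         # career allow rate, %
with_interview = round(base_rate + interview_lift)  # projected with interview, %
print(base_rate, with_interview)  # 77 96
```

602/782 is 76.98%, which rounds to the displayed 77%; adding the 18.5-point lift to the rounded base gives 95.5, displayed as 96%.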
