Prosecution Insights
Last updated: April 19, 2026
Application No. 18/729,665

SYSTEMS AND METHODS OF ADAPTIVE PHASE DETECTION AUTOFOCUS OFFSET CORRECTION

Non-Final OA: §102, §103

Filed: Jul 17, 2024
Examiner: FLOHRE, JASON A
Art Unit: 2637
Tech Center: 2600 — Communications
Assignee: Qualcomm Incorporated
OA Round: 1 (Non-Final)

Grant Probability: 69% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 6m
Grant Probability With Interview: 87%

Examiner Intelligence

Career Allow Rate: 69% (496 granted / 720 resolved; +6.9% vs TC avg), above average
Interview Lift: +17.7% among resolved cases with interview, a strong lift
Typical Timeline: 2y 6m average prosecution; 25 applications currently pending
Career History: 745 total applications across all art units
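The headline figures above are internally consistent. A minimal sketch of the apparent arithmetic, assuming the interview lift is a simple additive adjustment to the career allow rate (an inference from the displayed numbers, not a documented formula):

```python
# Reproducing the examiner stats shown above from the raw counts.
# NOTE: treating the interview lift as additive is an assumption inferred
# from the displayed numbers, not a documented formula.
granted, resolved = 496, 720

allow_rate = granted / resolved              # ~0.689 -> shown as 69%
with_interview = allow_rate + 0.177          # +17.7 pp lift -> shown as 87%

print(f"Career allow rate: {allow_rate:.1%}")     # Career allow rate: 68.9%
print(f"With interview:    {with_interview:.1%}") # With interview:    86.6%
```

The 69% and 87% shown on the page appear to be these values rounded to whole percentage points.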

Statute-Specific Performance

§101: 3.5% (-36.5% vs TC avg)
§103: 53.3% (+13.3% vs TC avg)
§102: 24.4% (-15.6% vs TC avg)
§112: 12.8% (-27.2% vs TC avg)

Compared against a Tech Center average estimate; based on career data from 720 resolved cases.
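One sanity check on the table above: all four "vs TC avg" deltas imply the same Tech Center baseline. This is an inference from the displayed numbers, not something the page states directly:

```python
# Each statute's rate minus its "vs TC avg" delta recovers the implied
# Tech Center baseline -- the same 40.0% for all four statutes.
rates  = {"§101": 3.5, "§103": 53.3, "§102": 24.4, "§112": 12.8}
deltas = {"§101": -36.5, "§103": 13.3, "§102": -15.6, "§112": -27.2}

implied_baseline = {s: round(rates[s] - deltas[s], 1) for s in rates}
print(implied_baseline)  # {'§101': 40.0, '§103': 40.0, '§102': 40.0, '§112': 40.0}
```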

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1, 9-15, 17 and 25-30 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Kurisu (United States Patent Application Publication 2019/0089892).

Regarding claim 1, Kurisu discloses an apparatus for imaging, the apparatus comprising: at least one memory (figure 1 exhibits ROM 137 as disclosed at paragraph 28); and one or more processors coupled to the at least one memory (figure 1 exhibits a processor comprising focus detection signal processing unit 125 and control unit 141 as disclosed at paragraph 30), the one or more processors configured to: receive image data captured by an image sensor according to one or more image capture settings, wherein the image data includes focus pixel data (paragraph 26 discloses receiving picture data which includes pairs of picture signals for focus detection); determine a first focus setting based on phase detection using the focus pixel data (figure 3 exhibits step S303 in which a first focus setting L1 is determined as disclosed at paragraphs 33 and 68); determine a second focus setting at least in part by adjusting the first focus setting according to a focus offset that is based on the one or more image capture settings (figure 3 exhibits step S308 in which a second focus setting L4 is determined by adjusting the first focus setting based on the aperture as disclosed at paragraph 66); and cause a focus control mechanism to set a focus parameter to the second focus setting (figure 8 exhibits step S805 in which the lens is moved as disclosed at paragraph 67).

Regarding claim 9, Kurisu discloses the apparatus of claim 1; in addition, Kurisu discloses wherein, to determine the first focus setting based on phase detection using the focus pixel data, the one or more processors are configured to identify a phase difference between a first focus dataset of the focus pixel data and a second focus dataset of the focus pixel data, wherein the first focus dataset is associated with a first focus pixel of the image sensor, wherein the second focus dataset is associated with a second focus pixel of the image sensor (figure 2 exhibits a pixel array with a plurality of phase detection pixels that generate A images and B images; these correspond to the claimed first and second focus datasets. Because each pixel is associated with both focus datasets, the first focus dataset is associated with the top left pixel and the second focus dataset is associated with the pixel next to the top left pixel).

Regarding claim 10, Kurisu discloses the apparatus of claim 1; in addition, Kurisu discloses wherein the one or more image capture settings include a third focus setting that is distinct from the second focus setting, wherein, to cause the focus control mechanism to set the focus parameter to the second focus setting, the one or more processors are configured to cause the focus control mechanism to adjust the focus parameter from the third focus setting to the second focus setting (figure 6 exhibits focus setting L2, which is an intermediate position between focus setting L1 and focus setting L4, as disclosed at paragraph 68).
Regarding claim 11, Kurisu discloses the apparatus of claim 1; in addition, Kurisu discloses wherein, to cause the focus control mechanism to set the focus parameter to the second focus setting, the one or more processors are configured to cause actuation of a linear actuator of the focus control mechanism to move a lens from a first lens position to a second lens position that corresponds to the second focus setting, wherein the image data is captured based on light passing through the lens and reaching the image sensor (figure 1 shows that the lens moves perpendicular to the image sensor plane to adjust focusing; therefore the actuator that performs the correction can be considered to be linear; figure 1 exhibits wherein light passes through the diaphragm 113 and lens 114 and reaches image sensor 121; paragraph 66 teaches moving the lens from L1 to L4 in the optical axis direction).

Regarding claim 12, Kurisu discloses the apparatus of claim 1; in addition, Kurisu discloses wherein, to cause the focus control mechanism to set the focus parameter to the second focus setting, the one or more processors are configured to cause actuation of a linear actuator of the focus control mechanism to move a lens in a direction that is perpendicular to an image plane of the image sensor (figure 1 shows that the lens moves perpendicular to the image sensor plane to adjust focusing; therefore the actuator that performs the correction can be considered to be linear).

Regarding claim 13, Kurisu discloses the apparatus of claim 1; in addition, Kurisu discloses wherein the one or more image capture settings include at least one of aperture size, temperature, lux, lens position, or region of interest (figure 3 exhibits step S308 in which a second focus setting L4 is determined by adjusting the first focus setting based on the aperture as disclosed at paragraph 66).
Regarding claim 14, Kurisu discloses the apparatus of claim 1; in addition, Kurisu discloses wherein the one or more processors are configured to: receive secondary image data captured by the image sensor according to the second focus setting (figure 3 exhibits step S310 in which image capturing is performed as disclosed at paragraph 39); and output the secondary image data (paragraph 27 teaches displaying image data).

Regarding claim 15, Kurisu discloses the apparatus of claim 14; in addition, Kurisu discloses a display interface (figure 1 exhibits display control unit 132 as disclosed at paragraph 27), wherein, to output the secondary image data, the one or more processors are configured to cause the secondary image data to be displayed using a display at least in part by sending the secondary image data to the display through the display interface (paragraph 27 teaches that image data is output to the display through display control unit 132).

Claim 17 is a method variant of the apparatus of claim 1 and is rejected for reasons similar to those of claim 1. Claim 25 is a method variant of the apparatus of claim 9 and is rejected for reasons similar to those of claim 9. Claim 26 is a method variant of the apparatus of claim 10 and is rejected for reasons similar to those of claim 10. Claim 27 is a method variant of the apparatus of claim 11 and is rejected for reasons similar to those of claim 11. Claim 28 is a method variant of the apparatus of claim 12 and is rejected for reasons similar to those of claim 12. Claim 29 is a method variant of the apparatus of claim 13 and is rejected for reasons similar to those of claim 13. Claim 30 is a method variant of the apparatus of claim 14 and is rejected for reasons similar to those of claim 14.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 2, 5, 8, 18, 21 and 24 are rejected under 35 U.S.C. 103 as being unpatentable over Kurisu in view of Shigeta (United States Patent Application Publication 2022/0174207).

Regarding claim 2, Kurisu discloses the apparatus of claim 1; however, Kurisu fails to disclose wherein the one or more processors are configured to: determine the focus offset based on use of the one or more image capture settings as inputs to a trained model. Shigeta is a similar or analogous system to the claimed invention, as evidenced by Shigeta's teaching of an imaging device. The motivation of controlling a focus lens in a well-balanced manner would have prompted a predictable variation of Kurisu by applying Shigeta's known principle of determining the focus offset based on use of the one or more image capture settings as inputs to a trained model (figure 6 exhibits a trained model which uses capture settings such as depth of focus in order to determine a focus offset, drive signal Y1, as disclosed at paragraph 128). In view of motivations such as controlling a focus lens in a well-balanced manner, one of ordinary skill in the art would have implemented the claimed variation of the prior art system of Kurisu. Therefore, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.
Regarding claim 5, Kurisu in view of Shigeta discloses the apparatus of claim 2; in addition, Shigeta discloses wherein the trained model includes a decision tree (paragraph 232 teaches that the model may include a decision tree).

Regarding claim 8, Kurisu in view of Shigeta discloses the apparatus of claim 2; in addition, Shigeta discloses wherein the trained model is trained using training data that is generated based on prior focus settings for the focus control mechanism that are determined by the one or more processors based on prior image capture settings for image capture using the image sensor (figure 15 exhibits a learning process for the model which includes step 1503 in which a drive command includes autofocus control as disclosed at paragraph 223; paragraph 226 teaches that capture settings including target focus position and depth of focus are used to determine a drive signal).

Claim 18 is a method variant of the apparatus of claim 2 and is rejected for reasons similar to those of claim 2. Claim 21 is a method variant of the apparatus of claim 5 and is rejected for reasons similar to those of claim 5. Claim 24 is a method variant of the apparatus of claim 8 and is rejected for reasons similar to those of claim 8.

Claims 3 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Kurisu in view of Shigeta and further in view of Abe (United States Patent Application Publication 2016/0373643).

Regarding claim 3, Kurisu in view of Shigeta discloses the apparatus of claim 2; however, Kurisu fails to disclose wherein the trained model includes focus offset maps that include respective focus offsets corresponding to different regions of interest to focus on.
Abe is a similar or analogous system to the claimed invention, as evidenced by Abe's teaching of an imaging device. The motivation of properly compensating defocus based on lens curvature would have prompted a predictable variation of Kurisu by applying Abe's known principle of correcting focus using focus offset maps that include respective focus offsets corresponding to different regions of interest to focus on (figures 5B, 5C and 5D exhibit focus offset maps that have different corrections for different areas on the imaging plane as disclosed at paragraph 49). In view of motivations such as properly compensating defocus based on lens curvature, one of ordinary skill in the art would have implemented the claimed variation of the prior art system of Kurisu. Therefore, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.

Claim 19 is a method variant of the apparatus of claim 3 and is rejected for reasons similar to those of claim 3.

Claims 4 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Kurisu in view of Shigeta and further in view of Feng et al. (United States Patent Application Publication 2019/0222759), hereinafter referenced as Feng.

Regarding claim 4, Kurisu in view of Shigeta discloses the apparatus of claim 2; however, Kurisu fails to disclose wherein the trained model includes a linear regression. Feng is a similar or analogous system to the claimed invention, as evidenced by Feng's teaching of a method for determining a focus position. The motivation of compensating for motion during focusing would have prompted a predictable variation of Kurisu by applying Feng's known principle of using a linear regression to determine a focus point offset (figure 6 exhibits wherein a plurality of focus measurements are used to determine a final position as part of a linear regression as disclosed at paragraph 62).
In view of motivations such as compensating for motion during focusing, one of ordinary skill in the art would have implemented the claimed variation of the prior art system of Kurisu. Therefore, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.

Claim 20 is a method variant of the apparatus of claim 4 and is rejected for reasons similar to those of claim 4.

Claims 6 and 22 are rejected under 35 U.S.C. 103 as being unpatentable over Kurisu in view of Shigeta and further in view of Li et al. (United States Patent Application Publication 2021/0074016), hereinafter referenced as Li.

Regarding claim 6, Kurisu in view of Shigeta discloses the apparatus of claim 2; however, Kurisu fails to disclose wherein the one or more processors are configured to: periodically retrain the trained model according to a schedule. Li is a similar or analogous system to the claimed invention, as evidenced by Li's teaching of an imaging device using a trained model. The motivation of improving the accuracy of the model over time would have prompted a predictable variation of Kurisu by applying Li's known principle of periodically retraining a trained model according to a schedule (paragraph 45 teaches retraining a model on a periodic basis). In view of motivations such as improving the accuracy of the model over time, one of ordinary skill in the art would have implemented the claimed variation of the prior art system of Kurisu. Therefore, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.

Claim 22 is a method variant of the apparatus of claim 6 and is rejected for reasons similar to those of claim 6.

Claims 7 and 23 are rejected under 35 U.S.C. 103 as being unpatentable over Kurisu in view of Shigeta and further in view of Galor Gluskin et al. (United States Patent Application Publication 2018/0349378), hereinafter referenced as Galor Gluskin.

Regarding claim 7, Kurisu in view of Shigeta discloses the apparatus of claim 2; however, Kurisu fails to disclose wherein the trained model is trained using training data that indicates one or more respective differences between one or more phase detection autofocus (PDAF) focus settings that are determined using PDAF and one or more corresponding contrast detection autofocus (CDAF) focus settings that are determined using CDAF with the one or more PDAF focus settings as respective starting points. Galor Gluskin is a similar or analogous system to the claimed invention, as evidenced by Galor Gluskin's teaching of an imaging device. The motivation of improving the accuracy of phase detection autofocusing would have prompted a predictable variation of Kurisu by applying Galor Gluskin's known principle of calibrating a system using data that indicates one or more respective differences between one or more phase detection autofocus (PDAF) focus settings that are determined using PDAF and one or more corresponding contrast detection autofocus (CDAF) focus settings that are determined using CDAF with the one or more PDAF focus settings as respective starting points (figure 8 exhibits wherein, based on a first phase detection focus position determined in step 915, a difference between the phase detection focus position and a contrast focus position obtained in step 925 is determined and used to calibrate the phase detection focus process as disclosed at paragraph 71). In view of motivations such as improving the accuracy of phase detection autofocusing, one of ordinary skill in the art would have implemented the claimed variation of the prior art system of Kurisu. Therefore, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.
Claim 23 is a method variant of the apparatus of claim 7 and is rejected for reasons similar to those of claim 7.

Claim 16 is rejected under 35 U.S.C. 103 as being unpatentable over Kurisu in view of Kim et al. (United States Patent Application Publication 2009/0015703), hereinafter referenced as Kim.

Regarding claim 16, Kurisu discloses the apparatus of claim 14; however, Kurisu fails to disclose a communication interface, wherein, to output the secondary image data, the one or more processors are configured to send the secondary image data to a recipient device using the communication interface. Kim is a similar or analogous system to the claimed invention, as evidenced by Kim's teaching of an imaging device. The motivation of sharing images with friends and family would have prompted a predictable variation of Kurisu by applying Kim's known principle of providing a communication interface, wherein, to output the secondary image data, the one or more processors are configured to send the secondary image data to a recipient device using the communication interface (figure 1 exhibits wireless communication unit 110 which can output captured still images to an external device as disclosed at paragraph 157). In view of motivations such as sharing images with friends and family, one of ordinary skill in the art would have implemented the claimed variation of the prior art system of Kurisu. Therefore, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.

Citation of Pertinent Art

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Sato et al. (United States Patent Application Publication 2023/0055269) teaches a method for focus correction. Jung et al. (United States Patent Application Publication 2022/0311940) teaches a method for focus correction. Shimizu et al. (United States Patent Application Publication 2020/0412969) teaches a method for focus correction. Chino et al. (United States Patent Application Publication 2020/0412969) teaches a method for focus correction. Miyatani (United States Patent Application Publication 2020/0412969) teaches a method for focus correction. Kikuchi et al. (United States Patent Application Publication 2019/0387175) teaches a method for focus correction.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JASON A FLOHRE whose telephone number is (571)270-7238. The examiner can normally be reached Mon-Fri 8:00-3:00.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Sinh Tran, can be reached at 571-272-7564. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

JASON A. FLOHRE
Patent Examiner, Art Unit 2637
/JASON A FLOHRE/

Prosecution Timeline

Jul 17, 2024: Application Filed
Jan 02, 2026: Non-Final Rejection — §102, §103
Mar 24, 2026: Interview Requested
Apr 07, 2026: Response Filed
Apr 07, 2026: Applicant Interview (Telephonic)
Apr 09, 2026: Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12593150: IMAGE CAPTURING APPARATUS, CONTROL METHOD THEREOF, AND STORAGE MEDIUM
Granted Mar 31, 2026 (2y 5m to grant)

Patent 12568308: PHOTOGRAPHING FRAME RATE CONTROL METHOD, ELECTRONIC DEVICE, CHIP SYSTEM, AND READABLE STORAGE MEDIUM
Granted Mar 03, 2026 (2y 5m to grant)

Patent 12554096: ACTUATOR ARRANGEMENT
Granted Feb 17, 2026 (2y 5m to grant)

Patent 12501132: SET-TOP BOX WITH AN INTEGRATED OPTICAL SENSOR AND SYSTEM COMPRISING SUCH A SET-TOP BOX
Granted Dec 16, 2025 (2y 5m to grant)

Patent 12439150: IMAGING DEVICE, IMAGING SYSTEM, SCOPE, ENDOSCOPE SYSTEM, AND IMAGING METHOD
Granted Oct 07, 2025 (2y 5m to grant)

Study what changed to get past this examiner. Based on the 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 69%
With Interview: 87% (+17.7%)
Median Time to Grant: 2y 6m
PTA Risk: Low

Based on 720 resolved cases by this examiner. Grant probability derived from career allow rate.
