Prosecution Insights
Last updated: April 19, 2026
Application No. 18/537,452

HYBRID AUTO FOCUS TRACKING METHOD AND HYBRID AUTO FOCUS TRACKING DEVICE

Non-Final OA · §102, §103
Filed: Dec 12, 2023
Examiner: WOLFSON, ETHAN NOAH
Art Unit: 2673
Tech Center: 2600 — Communications
Assignee: Samsung Electronics Co., Ltd.
OA Round: 1 (Non-Final)
Grant Probability: Favorable
Estimated OA Rounds: 1-2
Estimated Time to Grant: 2y 9m

Examiner Intelligence

Career Allow Rate: 0% (0 granted / 0 resolved; -62.0% vs TC avg)
Interview Lift: +0.0% (minimal lift among resolved cases with interview)
Avg Prosecution: 2y 9m (typical timeline)
Career History: 15 total applications across all art units; 15 currently pending

Statute-Specific Performance

§101: 14.3% (-25.7% vs TC avg)
§103: 51.4% (+11.4% vs TC avg)
§102: 20.0% (-20.0% vs TC avg)
§112: 8.6% (-31.4% vs TC avg)
Compared against Tech Center average estimates • Based on career data from 0 resolved cases

Office Action

§102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Receipt is acknowledged of certified copies of papers submitted under 35 U.S.C. 119(a)-(d), which papers have been placed of record in the file.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 12/12/2023 is being considered by the examiner.

Drawings

The drawings are objected to because Figure 5 step S540 “CALCULATING A A SECOND FOCUSING POSITION…” should read “CALCULATING A SECOND FOCUSING POSITION…”. Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. The figure or figure number of an amended drawing should not be labeled as “amended.” If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
Specification

The specification is objected to because of the following informalities: On page 6, line 1, “a graphics processor (CPU)…” should read “a graphics processor (GPU)…” Appropriate correction is required.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1 and 10 are rejected under 35 U.S.C. 102(a)(1)/(a)(2) as being anticipated by KIM et al. (US 20210352215 A1), hereinafter referenced as KIM.

Regarding claim 1, KIM explicitly teaches a hybrid auto focus tracking method (Fig. 10. Paragraph [0202]), the hybrid auto focus tracking method comprising: determining a first lens focusing position based on a phase difference of a phase detection image output by a phase detection image sensor (Fig. 6.
Paragraph [0158]-KIM discloses the second focal position information may be information on a focal movement position for moving the focal position of the second focus lens 421 based on the phase difference acquired from a phase difference image of the second camera.); determining a second lens focusing position based on a zoom ratio of feature points within a region of interest in a scene image output by the phase detection image sensor (Fig. 6. Paragraph [0166]-KIM discloses the first focal position information may include focal position information of the first focus lens 411 corresponding to a magnification for a specific point or a subject distance to a specific point. At this time, when the first focal position information includes the distance to the subject to be photographed and the focal position of the first focus lens 411 corresponding to the currently set zoom magnification, the controller 440 may extract this and control the focal position of the first focus lens 411.); determining a final lens focusing position by fusing the first lens focusing position and the second lens focusing position (Fig. 6. Paragraph [0164]-KIM discloses the storage 430 may further store third focal position information for determining a focal movement position of the first focus lens 411 by using the second focal position information. The third focal position information matches focal position characteristics of the first focus lens 411 and the second focus lens 421, and this refers to information on the focal position of the first focus lens 411 corresponding to the focal position of the second focus lens 421 acquired based on the matched focal position characteristic.); and performing focus tracking of the phase detection image sensor based on the final lens focusing position (Fig. 6. 
Paragraph [0169]-KIM discloses when the absolute value of the first phase difference is greater than the threshold value, it means that the first focal position information is incorrect, and accordingly, the controller 440 re-moves the focal position of the first focus lens 411 to the best position using the third focal position information.).

Regarding claim 10, KIM explicitly teaches a hybrid auto focus tracking device (Fig. 10. Paragraph [0202]), the hybrid auto focus tracking device comprising: a memory (Fig. 6, #430 called storage. Paragraph [0154]), storing executable instructions (Fig. 6. Paragraph [0153]-KIM discloses the storage 430 stores information necessary for the operation of the camera module or information generated during the operation of the camera module.), and a processor (Fig. 6, #440 called a controller. Paragraph [0342]) configured to execute the executable instructions to (Fig. 6. Paragraph [0342]-KIM discloses the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. In some cases, such embodiments may be implemented by the controller 180.): determine a first lens focusing position based on a phase difference of a phase detection image output by a phase detection image sensor (Fig. 6.
Paragraph [0158]-KIM discloses the second focal position information may be information on a focal movement position for moving the focal position of the second focus lens 421 based on the phase difference acquired from a phase difference image of the second camera.); determine a second lens focusing position based on a zoom ratio of feature points within a region of interest in a scene image output by the phase detection image sensor (Fig. 6. Paragraph [0166]-KIM discloses the first focal position information may include focal position information of the first focus lens 411 corresponding to a magnification for a specific point or a subject distance to a specific point. At this time, when the first focal position information includes the distance to the subject to be photographed and the focal position of the first focus lens 411 corresponding to the currently set zoom magnification, the controller 440 may extract this and control the focal position of the first focus lens 411.); determine a final lens focusing position by fusing the first lens focusing position and the second lens focusing position (Fig. 6. Paragraph [0164]-KIM discloses the storage 430 may further store third focal position information for determining a focal movement position of the first focus lens 411 by using the second focal position information. The third focal position information matches focal position characteristics of the first focus lens 411 and the second focus lens 421, and this refers to information on the focal position of the first focus lens 411 corresponding to the focal position of the second focus lens 421 acquired based on the matched focal position characteristic.); and perform focus tracking of the phase detection image sensor based on the final lens focusing position (Fig. 6. 
Paragraph [0169]-KIM discloses when the absolute value of the first phase difference is greater than the threshold value, it means that the first focal position information is incorrect, and accordingly, the controller 440 re-moves the focal position of the first focus lens 411 to the best position using the third focal position information.).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 2, 11, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over KIM et al. (US 20210352215 A1), hereinafter referenced as KIM, in view of GALOR GLUSKIN et al. (US 20180349378 A1), hereinafter referenced as GALOR GLUSKIN.

Regarding claim 2, KIM explicitly teaches the hybrid auto focus tracking method of claim 1, and further explicitly teaches wherein the fusing the first lens focusing position and the second lens focusing position comprises (Fig. 6.
Paragraph [0164]-KIM discloses the storage 430 may further store third focal position information for determining a focal movement position of the first focus lens 411 by using the second focal position information. The third focal position information matches focal position characteristics of the first focus lens 411 and the second focus lens 421, and this refers to information on the focal position of the first focus lens 411 corresponding to the focal position of the second focus lens 421 acquired based on the matched focal position characteristic.): KIM fails to explicitly teach determining a first confidence of the first lens focusing position, wherein the first confidence indicates a reliability that the first lens focusing position is able to be the final lens focusing position; determining a second confidence of the second lens focusing position, wherein the second confidence indicates a reliability that the second lens focusing position is able to be the final lens focusing position; and fusing the first lens focusing position and the second lens focusing position as the final lens focusing position based on the first confidence and the second confidence. However, GALOR GLUSKIN explicitly teaches determining a first confidence of the first lens focusing position (Fig. 8. Paragraph [0065]-GALOR GLUSKIN discloses the confidence level of the final lens position may be determined based on measurements associated with the particular auxiliary AF process being implemented. In DCIAF, the confidence level may be determined based on a number of detected corner points and the matching of the corner points between images. In TOFAF, the confidence level may be determined based on the amount of ambient light and the reflectance of a target object in the scene.), wherein the first confidence indicates a reliability that the first lens focusing position is able to be the final lens focusing position (Fig. 8. 
Paragraph [0066]-GALOR GLUSKIN discloses when the confidence level of the sample is greater than a threshold confidence level, the method proceeds to block 815, where the processor 205 determines whether the estimated lens position is within a threshold distance from the final lens position determined by the main AF process.); determining a second confidence of the second lens focusing position (Fig. 13. Paragraph [0089]-GALOR GLUSKIN discloses while the actuator 212 is moving the lens towards the instructed lens position, the processor 205 may continue to update the lens position LP, phase difference PD, and confidence level parameters based on corresponding changes in the images captured by the image sensor 214 (wherein the updated confidence level is the second confidence).), wherein the second confidence indicates a reliability that the second lens focusing position is able to be the final lens focusing position (Fig. 13. Paragraph [0089]-GALOR GLUSKIN discloses the threshold confidence level may be any value defined to determine whether a given set of a lens position measurement and a phase difference measurement are sufficiently accurate to be used in calibration of the coefficient K. After the confidence level returns to a level that is greater than a threshold confidence level, the processor 205 may estimate a change in lens position required to move the lens into focus and send instructions to the actuator 212 to move the lens 210. While the actuator 212 is moving the lens towards the instructed lens position, the processor 205 may continue to update the lens position LP, phase difference PD, and confidence level parameters based on corresponding changes in the images captured by the image sensor 214.); and fusing the first lens focusing position and the second lens focusing position as the final lens focusing position based on the first confidence and the second confidence (Fig. 13. 
Paragraph [0090]-GALOR GLUSKIN discloses once the phase difference PD returns to a value of substantially zero, the processor 205 may determine that the image of the scene is in focus at the new lens position LP. However, in certain embodiments, the processor 205 may wait until the confidence level is above a threshold confidence level prior to calculating a sample calibration coefficient K. In certain implementations, the processor 205 may use the lens position LP and phase difference calculated at two different points to determine the sample calibration coefficient K.).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of KIM of a hybrid auto focus tracking method, the hybrid auto focus tracking method comprising: determining a first lens focusing position based on a phase difference of a phase detection image output by a phase detection image sensor, with the teachings of GALOR GLUSKIN of determining a first confidence of the first lens focusing position, wherein the first confidence indicates a reliability that the first lens focusing position is able to be the final lens focusing position; determining a second confidence of the second lens focusing position, wherein the second confidence indicates a reliability that the second lens focusing position is able to be the final lens focusing position; and fusing the first lens focusing position and the second lens focusing position as the final lens focusing position based on the first confidence and the second confidence.
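In functional terms, the confidence-based fusion that GALOR GLUSKIN is cited against can be pictured as a confidence-weighted average of the two candidate positions. The sketch below is illustrative only: the weighting scheme, the `min_conf` floor, and the function name are assumptions for exposition, not code or parameters from either reference.

```python
def fuse_lens_positions(pos_pd: float, conf_pd: float,
                        pos_zoom: float, conf_zoom: float,
                        min_conf: float = 0.2) -> float:
    """Fuse two candidate lens focusing positions by confidence.

    pos_pd / conf_pd:     position and confidence from phase detection.
    pos_zoom / conf_zoom: position and confidence from zoom-ratio tracking.
    min_conf: confidence floor below which a candidate is ignored
              (an assumed parameter, not from the cited references).
    """
    # Keep only candidates whose confidence clears the floor.
    candidates = [(p, c) for p, c in [(pos_pd, conf_pd), (pos_zoom, conf_zoom)]
                  if c >= min_conf]
    if not candidates:
        raise ValueError("no candidate position is sufficiently reliable")
    # Confidence-weighted average of the surviving candidates.
    total = sum(c for _, c in candidates)
    return sum(p * c for p, c in candidates) / total
```

For example, `fuse_lens_positions(120.0, 0.9, 110.0, 0.3)` weights the phase-detection position three times as heavily as the zoom-tracking position, while a zoom confidence below the floor would leave the phase-detection position unchanged.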
Under the modification, KIM’s camera with an autofocus system would determine a first confidence of the first lens focusing position (wherein the first confidence indicates a reliability that the first lens focusing position is able to be the final lens focusing position), determine a second confidence of the second lens focusing position (wherein the second confidence indicates a reliability that the second lens focusing position is able to be the final lens focusing position), and fuse the first lens focusing position and the second lens focusing position as the final lens focusing position based on the first confidence and the second confidence. The motivation behind the modification would have been to obtain a hybrid autofocus tracking system that enhances the accuracy of determining the optimal focal position. Both KIM and GALOR GLUSKIN employ autofocusing in a camera: in KIM, it is possible to improve the accuracy of the focal position without being affected by a change in characteristics of an actuator that changes according to the number of times or time of use of the camera module, while in GALOR GLUSKIN, there is a need for improvement in the speed and/or accuracy of AF techniques to be used in imaging devices. Please see KIM et al. (US 20210352215 A1), Paragraph [0034], and GALOR GLUSKIN et al. (US 20180349378 A1), Paragraph [0002].

Regarding claim 11, KIM explicitly teaches the hybrid auto focus tracking device of claim 10, and further explicitly teaches wherein the processor is configured to (Fig. 6.
Paragraph [0342]-KIM discloses the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions.): KIM fails to explicitly teach determine a first confidence of the first lens focusing position, wherein the first confidence indicates a reliability that the first lens focusing position is able to be the final lens focusing position; determine a second confidence of the second lens focusing position, wherein the second confidence indicates a reliability that the second lens focusing position is able to be the final lens focusing position; and fuse the first lens focusing position and the second lens focusing position as the final lens focusing position based on the first confidence and the second confidence. However, GALOR GLUSKIN explicitly teaches determine a first confidence of the first lens focusing position (Fig. 8. Paragraph [0065]-GALOR GLUSKIN discloses the confidence level of the final lens position may be determined based on measurements associated with the particular auxiliary AF process being implemented. In DCIAF, the confidence level may be determined based on a number of detected corner points and the matching of the corner points between images. In TOFAF, the confidence level may be determined based on the amount of ambient light and the reflectance of a target object in the scene.), wherein the first confidence indicates a reliability that the first lens focusing position is able to be the final lens focusing position (Fig. 8. 
Paragraph [0066]-GALOR GLUSKIN discloses when the confidence level of the sample is greater than a threshold confidence level, the method proceeds to block 815, where the processor 205 determines whether the estimated lens position is within a threshold distance from the final lens position determined by the main AF process.); determine a second confidence of the second lens focusing position (Fig. 13. Paragraph [0089]-GALOR GLUSKIN discloses while the actuator 212 is moving the lens towards the instructed lens position, the processor 205 may continue to update the lens position LP, phase difference PD, and confidence level parameters based on corresponding changes in the images captured by the image sensor 214 (wherein the updated confidence level is the second confidence).), wherein the second confidence indicates a reliability that the second lens focusing position is able to be the final lens focusing position (Fig. 13. Paragraph [0089]-GALOR GLUSKIN discloses the threshold confidence level may be any value defined to determine whether a given set of a lens position measurement and a phase difference measurement are sufficiently accurate to be used in calibration of the coefficient K. After the confidence level returns to a level that is greater than a threshold confidence level, the processor 205 may estimate a change in lens position required to move the lens into focus and send instructions to the actuator 212 to move the lens 210. While the actuator 212 is moving the lens towards the instructed lens position, the processor 205 may continue to update the lens position LP, phase difference PD, and confidence level parameters based on corresponding changes in the images captured by the image sensor 214.); and fuse the first lens focusing position and the second lens focusing position as the final lens focusing position based on the first confidence and the second confidence (Fig. 13. 
Paragraph [0090]-GALOR GLUSKIN discloses once the phase difference PD returns to a value of substantially zero, the processor 205 may determine that the image of the scene is in focus at the new lens position LP. However, in certain embodiments, the processor 205 may wait until the confidence level is above a threshold confidence level prior to calculating a sample calibration coefficient K. In certain implementations, the processor 205 may use the lens position LP and phase difference calculated at two different points to determine the sample calibration coefficient K.).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of KIM of a hybrid auto focus tracking device, the hybrid auto focus tracking device comprising: a memory, storing executable instructions, and a processor configured to execute the executable instructions to: determine a first lens focusing position based on a phase difference of a phase detection image output by a phase detection image sensor, with GALOR GLUSKIN’s teaching to determine a first confidence of the first lens focusing position, wherein the first confidence indicates a reliability that the first lens focusing position is able to be the final lens focusing position; determine a second confidence of the second lens focusing position, wherein the second confidence indicates a reliability that the second lens focusing position is able to be the final lens focusing position; and fuse the first lens focusing position and the second lens focusing position as the final lens focusing position based on the first confidence and the second confidence.

Under the modification, KIM’s camera with an autofocus system would determine a first confidence of the first lens focusing position (wherein the first confidence indicates a reliability that the first lens focusing position is able to be the final lens focusing position), determine a second confidence of the second lens focusing position (wherein the second confidence indicates a reliability that the second lens focusing position is able to be the final lens focusing position), and fuse the first lens focusing position and the second lens focusing position as the final lens focusing position based on the first confidence and the second confidence. The motivation behind the modification would have been to obtain a hybrid autofocus tracking system that enhances the accuracy of determining the optimal focal position. Both KIM and GALOR GLUSKIN employ autofocusing in a camera: in KIM, it is possible to improve the accuracy of the focal position without being affected by a change in characteristics of an actuator that changes according to the number of times or time of use of the camera module, while in GALOR GLUSKIN, there is a need for improvement in the speed and/or accuracy of AF techniques to be used in imaging devices. Please see KIM et al. (US 20210352215 A1), Paragraph [0034], and GALOR GLUSKIN et al. (US 20180349378 A1), Paragraph [0002].

Regarding claim 19, KIM explicitly teaches a hybrid auto focus tracking method (Fig. 10. Paragraph [0202]), the hybrid auto focus tracking method comprising: determining a first lens focusing position based on a phase difference of a phase detection image output by a phase detection image sensor (Fig. 6.
Paragraph [0158]-KIM discloses the second focal position information may be information on a focal movement position for moving the focal position of the second focus lens 421 based on the phase difference acquired from a phase difference image of the second camera.); determining a second lens focusing position based on a zoom ratio of feature points within a region of interest in a scene image output by the phase detection image sensor (Fig. 6. Paragraph [0166]-KIM discloses the first focal position information may include focal position information of the first focus lens 411 corresponding to a magnification for a specific point or a subject distance to a specific point. At this time, when the first focal position information includes the distance to the subject to be photographed and the focal position of the first focus lens 411 corresponding to the currently set zoom magnification, the controller 440 may extract this and control the focal position of the first focus lens 411.); performing refocusing based on the difference between the final lens focusing position and a lens current focusing position (Fig. 6. Paragraph [0169]-KIM discloses when an absolute value of the first phase difference is greater than a preset threshold value, the controller 440 re-determines the focal position of the first focus lens 411 using the third focal position information. That is, when the absolute value of the first phase difference is greater than the threshold value, it means that the first focal position information is incorrect, and accordingly, the controller 440 re-moves the focal position of the first focus lens 411 to the best position using the third focal position information. Here, the threshold value may be determined according to the characteristics of the first focus lens 411). 
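KIM’s zoom-tracking mapping — stored focal position information keyed to the currently set zoom magnification — can be pictured as a calibration-table lookup with interpolation. This is a hedged illustrative sketch, not KIM’s implementation: the table values, the linear interpolation, and the function name are all assumptions.

```python
import bisect

# Hypothetical calibration table for a fixed subject distance: the in-focus
# lens position at each zoom magnification. Values are illustrative only.
ZOOM_TO_POSITION = [(1.0, 100.0), (2.0, 140.0), (4.0, 190.0), (8.0, 230.0)]

def lens_position_for_zoom(zoom: float) -> float:
    """Look up (and linearly interpolate) the stored focus-lens position
    for the currently set zoom magnification."""
    zooms = [z for z, _ in ZOOM_TO_POSITION]
    # Clamp to the calibrated range.
    if zoom <= zooms[0]:
        return ZOOM_TO_POSITION[0][1]
    if zoom >= zooms[-1]:
        return ZOOM_TO_POSITION[-1][1]
    # Find the bracketing table entries and interpolate between them.
    i = bisect.bisect_right(zooms, zoom)
    (z0, p0), (z1, p1) = ZOOM_TO_POSITION[i - 1], ZOOM_TO_POSITION[i]
    t = (zoom - z0) / (z1 - z0)
    return p0 + t * (p1 - p0)
```

A real system would index such a table by both zoom magnification and subject distance, as the cited paragraph [0166] suggests; a one-dimensional table keeps the sketch short.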
KIM fails to explicitly teach determining a first confidence of the first lens focusing position, wherein the first confidence indicates a reliability that the first lens focusing position is able to be the final lens focusing position; determining a second confidence of the second lens focusing position, wherein the second confidence indicates a reliability that the second lens focusing position is able to be the final lens focusing position; determining a final lens focusing position by fusing the first lens focusing position and the second lens focusing position based on the first confidence and the second confidence; and. However, GALOR GLUSKIN explicitly teaches determining a first confidence of the first lens focusing position (Fig. 8. Paragraph [0065]- GALOR GLUSKIN discloses the confidence level of the final lens position may be determined based on measurements associated with the particular auxiliary AF process being implemented. In DCIAF, the confidence level may be determined based on a number of detected corner points and the matching of the corner points between images. In TOFAF, the confidence level may be determined based on the amount of ambient light and the reflectance of a target object in the scene.), wherein the first confidence indicates a reliability that the first lens focusing position is able to be the final lens focusing position (Fig. 8. Paragraph [0066]-GALOR GLUSKIN discloses when the confidence level of the sample is greater than a threshold confidence level, the method proceeds to block 815, where the processor 205 determines whether the estimated lens position is within a threshold distance from the final lens position determined by the main AF process.); determining a second confidence of the second lens focusing position (Fig. 13. 
Paragraph [0089]-GALOR GLUSKIN discloses while the actuator 212 is moving the lens towards the instructed lens position, the processor 205 may continue to update the lens position LP, phase difference PD, and confidence level parameters based on corresponding changes in the images captured by the image sensor 214 (wherein the updated confidence level is the second confidence).), wherein the second confidence indicates a reliability that the second lens focusing position is able to be the final lens focusing position (Fig. 13. Paragraph [0089]-GALOR GLUSKIN discloses the threshold confidence level may be any value defined to determine whether a given set of a lens position measurement and a phase difference measurement are sufficiently accurate to be used in calibration of the coefficient K. After the confidence level returns to a level that is greater than a threshold confidence level, the processor 205 may estimate a change in lens position required to move the lens into focus and send instructions to the actuator 212 to move the lens 210. While the actuator 212 is moving the lens towards the instructed lens position, the processor 205 may continue to update the lens position LP, phase difference PD, and confidence level parameters based on corresponding changes in the images captured by the image sensor 214.); determining a final lens focusing position by fusing the first lens focusing position and the second lens focusing position based on the first confidence and the second confidence (Fig. 13. Paragraph [0090]-GALOR GLUSKIN discloses once the phase difference PD returns to a value of substantially zero, the processor 205 may determine that the image of the scene is in focus at the new lens position LP. However, in certain embodiments, the processor 205 may wait until the confidence level is above a threshold confidence level prior to calculating a sample calibration coefficient K. 
In certain implementations, the processor 205 may use the lens position LP and phase difference calculated at two different points to determine the sample calibration coefficient K); and. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention was made to combine the teachings of KIM of A hybrid auto focus tracking method, the hybrid auto focus tracking method comprising: determining a first lens focusing position based on a phase difference of a phase detection image output by a phase detection image sensor; determining a second lens focusing position based on a zoom ratio of feature points within a region of interest in a scene image output by the phase detection image sensor with the teachings of GALOR GLUSKIN of determining a first confidence of the first lens focusing position, wherein the first confidence indicates a reliability that the first lens focusing position is able to be the final lens focusing position; determining a second confidence of the second lens focusing position, wherein the second confidence indicates a reliability that the second lens focusing position is able to be the final lens focusing position; determining a final lens focusing position by fusing the first lens focusing position and the second lens focusing position based on the first confidence and the second confidence; and. 
The combination would have KIM’s camera, with its autofocus system, determining a first confidence of the first lens focusing position, wherein the first confidence indicates a reliability that the first lens focusing position is able to be the final lens focusing position; determining a second confidence of the second lens focusing position, wherein the second confidence indicates a reliability that the second lens focusing position is able to be the final lens focusing position; and determining a final lens focusing position by fusing the first lens focusing position and the second lens focusing position based on the first confidence and the second confidence. The motivation behind the modification would have been to obtain a hybrid autofocus tracking system that enhances the accuracy of determining the optimal focal position. Both KIM and GALOR GLUSKIN employ autofocusing in a camera: in KIM, it is possible to improve the accuracy of the focal position without being affected by a change in characteristics of an actuator that changes according to the number of times or time of use of the camera module, while in GALOR GLUSKIN, there is a need for improvement in the speed and/or accuracy of AF techniques to be used in imaging devices. Please see KIM et al. (US 20210352215 A1), Paragraph [0034], and GALOR GLUSKIN et al. (US 20180349378 A1), Paragraph [0002]. Claims 9 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over KIM et al. (US 20210352215 A1), hereinafter referenced as KIM, in view of LEE et al. (US 20200244875 A1), hereinafter referenced as LEE. Regarding claim 9, KIM explicitly teaches the hybrid auto focus tracking method of claim 1. KIM further explicitly teaches wherein the performing the focus tracking of the phase detection image sensor comprises (Fig. 6. 
Paragraph [0198]-KIM discloses when the accuracy of the current focal position of the first camera is low, the correct focal position of the first camera is tracked by using the focal position information of the second camera according to the matching. As described above, in the embodiment according to the present invention, when implementing the auto focus function of the first camera, not only zoom tracking of the first camera but also the focal movement position is tracked using the focal position information of the second camera, and accordingly, accuracy can be improved.): performing the focus tracking of the phase detection image sensor based on the lens focusing position (Fig. 6. Paragraph [0169]-KIM discloses when the absolute value of the first phase difference is greater than the threshold value, it means that the first focal position information is incorrect, and accordingly, the controller 440 re-moves the focal position of the first focus lens 411 to the best position using the third focal position information.). KIM fails to explicitly teach obtaining the lens focusing position of a current frame by filtering and smoothing the final lens focusing position of the current frame and the final lens focusing positions of one or more previous frames. However, LEE explicitly teaches obtaining the lens focusing position of a current frame by filtering and smoothing the final lens focusing position of the current frame and the final lens focusing positions of one or more previous frames (Fig. 7. Paragraph [0120]-LEE discloses electronic device may move a focus of an image from the image sensor and a lens position to an optical axis direction which interconnects a subject and the image sensor. The electronic device may acquire IIR filter information according to the lens which moves in the optical axis direction. 
The electronic device may set the position of the highest IIR filter value as the focused position (wherein the IIR filter performs the smoothing filtering process).). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of KIM of a hybrid auto focus tracking method comprising determining a first lens focusing position based on a phase difference of a phase detection image output by a phase detection image sensor, with the teachings of LEE of obtaining the lens focusing position of a current frame by filtering and smoothing the final lens focusing position of the current frame and the final lens focusing positions of one or more previous frames. The combination would have KIM’s camera, with its autofocus system, obtaining the lens focusing position of a current frame by filtering and smoothing the final lens focusing position of the current frame and the final lens focusing positions of one or more previous frames. The motivation behind the modification would have been to obtain a hybrid autofocus tracking system that enhances the accuracy of determining the optimal focal position. Both KIM and LEE relate to cameras that employ image sensors to assist in focusing the camera: in KIM, it is possible to improve the accuracy of the focal position without being affected by a change in characteristics of an actuator that changes according to the number of times or time of use of the camera module, while in LEE, the operating frequency of the ISPs may be reduced relative to the operating frequency of the image sensor. Please see KIM et al. (US 20210352215 A1), Paragraph [0034], and LEE et al. (US 20200244875 A1), Paragraph [0182]. Regarding claim 18, KIM explicitly teaches the hybrid auto focus tracking device of claim 10. KIM further explicitly teaches wherein the processor is configured to (Fig. 
6 Paragraph [0342]-KIM discloses the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions.): perform the focus tracking of the phase detection image sensor based on the lens focusing position (Fig. 6. Paragraph [0169]-KIM discloses when the absolute value of the first phase difference is greater than the threshold value, it means that the first focal position information is incorrect, and accordingly, the controller 440 re-moves the focal position of the first focus lens 411 to the best position using the third focal position information.). KIM fails to explicitly teach obtain the lens focusing position of a current frame by filtering and smoothing the final lens focusing position of the current frame and the final lens focusing positions of one or more previous frames. However, LEE explicitly teaches obtain the lens focusing position of a current frame by filtering and smoothing the final lens focusing position of the current frame and the final lens focusing positions of one or more previous frames (Fig. 7. Paragraph [0120]-LEE discloses electronic device may move a focus of an image from the image sensor and a lens position to an optical axis direction which interconnects a subject and the image sensor. The electronic device may acquire IIR filter information according to the lens which moves in the optical axis direction. 
The electronic device may set the position of the highest IIR filter value as the focused position (wherein the IIR filter performs the smoothing filtering process).). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of KIM of a hybrid auto focus tracking device comprising a memory storing executable instructions and a processor configured to execute the executable instructions to determine a first lens focusing position based on a phase difference of a phase detection image output by a phase detection image sensor, with the teachings of LEE of obtaining the lens focusing position of a current frame by filtering and smoothing the final lens focusing position of the current frame and the final lens focusing positions of one or more previous frames. The combination would have KIM’s camera, with its autofocus system, obtaining the lens focusing position of a current frame by filtering and smoothing the final lens focusing position of the current frame and the final lens focusing positions of one or more previous frames. The motivation behind the modification would have been to obtain a hybrid autofocus tracking system that enhances the accuracy of determining the optimal focal position. Both KIM and LEE relate to cameras that employ image sensors to assist in focusing the camera: in KIM, it is possible to improve the accuracy of the focal position without being affected by a change in characteristics of an actuator that changes according to the number of times or time of use of the camera module, while in LEE, the operating frequency of the ISPs may be reduced relative to the operating frequency of the image sensor. Please see KIM et al. (US 20210352215 A1), Paragraph [0034], and LEE et al. (US 20200244875 A1), Paragraph [0182]. 
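The "filtering and smoothing" limitation mapped to LEE's IIR filter in claims 9 and 18 can be illustrated with a single-pole IIR filter (an exponential moving average) applied to the per-frame final lens focusing positions. This is a minimal sketch for illustration only: the function name, the `alpha` coefficient, and the first-order filter structure are assumptions, not details taken from the claims or from LEE.

```python
def smooth_lens_positions(final_positions, alpha=0.5):
    """Single-pole IIR smoothing over per-frame final lens focusing
    positions: y[n] = alpha * x[n] + (1 - alpha) * y[n-1].
    Returns the smoothed lens position for the current (last) frame."""
    smoothed = final_positions[0]
    for position in final_positions[1:]:
        smoothed = alpha * position + (1.0 - alpha) * smoothed
    return smoothed
```

With `alpha` near 1 the current frame dominates; smaller values weight previous frames more heavily, damping frame-to-frame jitter in the driven lens position.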
Allowable Subject Matter Claims 3-5, 12-14, 16-17, and 20, along with their dependent claims 6 and 15, are objected to as being dependent upon rejected base claims 1, 10, and 19, respectively, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims, once the drawing and specification objections are overcome. The following is a statement of reasons for the indication of allowable subject matter: Regarding claim 3, the prior art fails to explicitly teach the hybrid auto focus tracking method of claim 2, wherein the first confidence is determined based on one or more of a weight factor of signal-to-noise ratio, a texture weight factor, and a correlation weight factor during phase difference calculation, wherein the higher the signal-to-noise ratio of the phase detection image, the greater the weight factor of the signal-to-noise ratio, and the greater the first confidence, wherein the more complex a texture of the phase detection image, the greater the texture weight factor, and the greater the first confidence, wherein the smaller a correlation error of the phase detection image, the greater the correlation weight factor, and the greater the first confidence, as claimed in claim 3. Regarding claim 4, the prior art fails to explicitly teach the hybrid auto focus tracking method of claim 2, wherein the smaller a fitting error when calculating the zoom ratio, the greater the second confidence, as claimed in claim 4. 
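Claim 3 leaves the combination rule for its three weight factors unspecified ("based on one or more of"). Purely as a hypothetical illustration, one monotonicity-preserving choice is a product of factors normalized to [0, 1]; the function name and the product rule are assumptions, not claim limitations or teachings of the cited references:

```python
def first_confidence(snr_weight, texture_weight, correlation_weight):
    """Hypothetical combination of the claim 3 weight factors as a product.
    Each factor is assumed normalized to [0, 1], so the confidence grows
    with higher SNR, more complex texture, and smaller correlation error,
    matching the monotonic relationships the claim recites."""
    return snr_weight * texture_weight * correlation_weight
```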
Regarding claim 5, the prior art fails to explicitly teach the hybrid auto focus tracking method of claim 2, wherein the fusing the first lens focusing position and the second lens focusing position as the final lens focusing position comprises: obtaining an addition result by adding a product of the first lens focusing position and the first confidence to a product of the second lens focusing position and the second confidence; and calculating a resulting value obtained by dividing the addition result by a sum of the first confidence and the second confidence as the final lens focusing position, as claimed in claim 5. Regarding claim 7, the prior art fails to explicitly teach the hybrid auto focus tracking method of claim 1, wherein the determining the first lens focusing position comprises: obtaining an addition result by adding a product of a first lens calibration coefficient and a lens current position to a product of a second lens calibration coefficient and the phase difference; and determining a result value obtained by dividing the addition result by a sum of the first lens calibration coefficient and the phase difference as the first lens focusing position, or the determining the first lens focusing position comprises: using a table lookup method to determine the first lens focusing position based on the phase difference, as claimed in claim 7. 
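The claim 5 fusion arithmetic is a confidence-weighted average of the two candidate positions. A minimal sketch, assuming scalar positions and non-negative confidences that are not both zero (the function and parameter names are illustrative, not drawn from the application):

```python
def fuse_positions(pos1, conf1, pos2, conf2):
    """Claim 5 arithmetic: the final lens focusing position is
    (pos1 * conf1 + pos2 * conf2) / (conf1 + conf2)."""
    return (pos1 * conf1 + pos2 * conf2) / (conf1 + conf2)
```

When the phase-detection confidence dominates (`conf1` much greater than `conf2`), the fused result tracks the first position; equal confidences give the midpoint.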
Regarding claim 8, the prior art fails to explicitly teach the hybrid auto focus tracking method of claim 1, wherein the determining the second lens focusing position comprises: obtaining a first result by dividing the zoom ratio by a lens current position; obtaining a second result by dividing a difference between 1 and the zoom ratio by a focal length of a lens; and determining a reciprocal of a sum of the first result and the second result as the second lens focusing position, wherein the zoom ratio is obtained based on position fitting of the feature points in the region of interest and a camera projection model, as claimed in claim 8. Regarding claim 12, the prior art fails to explicitly teach the hybrid auto focus tracking device of claim 11, wherein the first confidence is determined based on one or more of a weight factor of signal-to-noise ratio, a texture weight factor, and a correlation weight factor during phase difference calculation, wherein the higher the signal-to-noise ratio of the phase detection image, the greater the weight factor of the signal-to-noise ratio, and the greater the first confidence, wherein the more complex a texture of the phase detection image, the greater the texture weight factor, and the greater the first confidence, wherein the smaller a correlation error of the phase detection image, the greater the correlation weight factor, and the greater the first confidence, as claimed in claim 12. Regarding claim 13, the prior art fails to explicitly teach the hybrid auto focus tracking device of claim 11, wherein the smaller a fitting error when calculating the zoom ratio, the greater the second confidence, as claimed in claim 13. 
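The claim 8 computation of the second lens focusing position is the reciprocal of a sum of two quotients, reminiscent of a thin-lens relation. A sketch implementing the arithmetic exactly as recited (variable names and units are assumptions; the claims do not specify them):

```python
def second_focus_position(zoom_ratio, current_position, focal_length):
    """Claim 8 arithmetic:
    1 / (zoom_ratio / current_position + (1 - zoom_ratio) / focal_length)."""
    first_result = zoom_ratio / current_position
    second_result = (1.0 - zoom_ratio) / focal_length
    return 1.0 / (first_result + second_result)
```

A zoom ratio of exactly 1 (no apparent scale change of the feature points between frames) returns the current lens position unchanged.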
Regarding claim 14, the prior art fails to explicitly teach the hybrid auto focus tracking device of claim 11, wherein the processor is configured to: obtain an addition result by adding a product of the first lens focusing position and the first confidence to a product of the second lens focusing position and the second confidence; and calculate a resulting value obtained by dividing the addition result by a sum of the first confidence and the second confidence as the final lens focusing position, as claimed in claim 14. Regarding claim 16, the prior art fails to explicitly teach the hybrid auto focus tracking device of claim 10, wherein the processor is configured to: obtain an addition result by adding a product of a first lens calibration coefficient and a lens current position to a product of a second lens calibration coefficient and the phase difference; and determine a result value obtained by dividing the addition result by a sum of the first lens calibration coefficient and the phase difference as the first lens focusing position, or the processor is configured to: use a table lookup method to determine the first lens focusing position based on the phase difference, as claimed in claim 16. Regarding claim 17, the prior art fails to explicitly teach the hybrid auto focus tracking device of claim 10, wherein the processor is configured to: obtain a first result by dividing the zoom ratio by a lens current position; obtain a second result by dividing a difference between 1 and the zoom ratio by a focal length of a lens; and determine a reciprocal of a sum of the first result and the second result as the second lens focusing position, wherein the zoom ratio is obtained based on position fitting of the feature points in the region of interest and a camera projection model, as claimed in claim 17. 
Regarding claim 20, the prior art fails to explicitly teach the hybrid auto focus tracking method of claim 19, wherein the smaller a fitting error when calculating the zoom ratio, the greater the second confidence, as claimed in claim 20. Conclusion Listed below is prior art made of record and not relied upon that is considered pertinent to applicant’s disclosure. GALOR GLUSKIN et al. (US 20180131862 A1) - Systems, methods, and devices for optimizing phase detection autofocus (PDAF) processing are provided. One aspect provides an apparatus comprising: an image sensor configured to capture image data of a scene; a buffer; and a processor. The processor may be configured to store the image data in the buffer as a current frame and divide the current frame into a plurality of windows each corresponding to a different spatial region of the scene. The processor may be further configured to identify a central portion of the current frame comprising a subset of the plurality of windows. The processor may be further configured to determine a depth value of the central portion based on performing PDAF on the subset of the plurality of windows and determine a confidence value for the central portion based on the depth value and image data corresponding to the subset of the plurality of windows. KADAMBALA et al. (US 20190335089 A1) - Methods, systems, and devices for dual phase detection auto focus (PDAF) power optimization are described. A camera device may capture a frame including a pixel array representing a scene. In some examples, each pixel of the pixel array may be a phase detection (PD) pixel having one or more values, or one or more PD pixels positioned randomly across the pixel array. The camera device may identify a configuration of the pixel array, and determine a condition of the pixel array relative to the configuration. 
The configuration may be a binning configuration or a frame pattern configuration, and the condition may include an illumination condition related to the pixel array representing the scene. The camera device may determine and apply a reconfiguration to at least a portion of the pixel array based on the condition of the pixel array, and determine a lens position of the camera device therewith. YOSHINO et al. (US 20140111628 A1) - A focus control device, an endoscope system, a focus control method, and the like can implement an autofocus operation in an appropriate state by setting a focus mode of an imaging optical system. The focus control device includes a focus control section that controls a focus of an imaging optical system that includes at least a zoom lens that adjusts an optical magnification, and sets a focus mode of the imaging optical system, and an image acquisition section that acquires an image through the imaging optical system, the focus mode including a fixed focus mode and an autofocus (AF) mode, and the focus control section switching the focus mode between the fixed focus mode and the AF mode corresponding to whether the zoom lens is positioned on a wide-angle side or a telescopic side relative to a reference point that is situated between a wide-angle end and a telescopic end. NAKAMARU et al. 
(US 20170366740 A1) - The focusing control device capable of preventing deterioration in precision of focusing control from being caused by an error in phase-difference depending on ambient temperature includes: an imaging element that outputs a pair of image signals deviated in one direction on the basis of one subject light image; a phase-difference detection section that detects a phase-difference between the pair of image signals; a temperature detection section; a correction section that corrects the in-focus position of the focus lens based on the detection phase-difference, which is the phase-difference detected by the phase-difference detection section, on the basis of the data in which the temperature, the focus lens position, and the information for in-focus position correction are associated with one another, the temperature which is detected by the temperature detection section, and the focus lens position; and a lens control section that moves the focus lens to the corrected in-focus position. ASANO (US 20080247742 A1) - An optical apparatus includes a detector detecting information used for focus control in each of a plurality of detection areas set in an image-pickup area, a selecting member being operated to change a selected area selected from the plurality of the detection areas, and a controller performing the focus control based on the information detected by the detector in the selected area. The controller determines a selectable detection area among the plurality of the detection areas based on the information used for the focus control detected in each of the detection areas and changes the selected area among the selectable detection areas in response to the operation of the selecting member. The optical apparatus reduces burdens in a selecting operation of the detection area for information used for the focus control and allows selection of an intended detection area quickly. 
SAKURABU (US 20180224629 A1) - A focusing control device includes: a sensor as defined herein; a first correlation value generation unit as defined herein; a second correlation value generation unit as defined herein; a first phase difference amount measurement unit as defined herein; a second phase difference amount measurement unit as defined herein; a target position determination unit as defined herein; and a lens driving control unit as defined herein, the target position determination unit calculates a temporary target position of the focus lens as defined herein, determines whether or not the target position of the focus lens based on the first phase difference amount falls within a predetermined depth range as defined herein, performs the first process in a case defined herein and performs the second process in a case defined herein, and, in the second process, the target position of the focus lens is determined as defined herein. Any inquiry concerning this communication or earlier communications from the examiner should be directed to ETHAN N WOLFSON whose telephone number is (571)272-1898. The examiner can normally be reached Monday - Friday 8:00 am - 5:00 pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chineyere Wills-Burns can be reached at (571) 272-9752. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. 
To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /ETHAN N WOLFSON/Examiner, Art Unit 2673 /CHINEYERE WILLS-BURNS/Supervisory Patent Examiner, Art Unit 2673

Prosecution Timeline

Dec 12, 2023
Application Filed
Jan 23, 2026
Non-Final Rejection — §102, §103
Feb 26, 2026
Interview Requested
Mar 10, 2026
Applicant Interview (Telephonic)
Mar 10, 2026
Examiner Interview Summary


Prosecution Projections

1-2
Expected OA Rounds
Grant Probability
2y 9m
Median Time to Grant
Low
PTA Risk
Based on 0 resolved cases by this examiner. Grant probability derived from career allow rate.
