Prosecution Insights
Last updated: April 19, 2026
Application No. 18/432,712

CAMERA CALIBRATION METHOD AND APPARATUS

Non-Final OA — §103, §DP
Filed
Feb 05, 2024
Examiner
LIEW, ALEX KOK SOON
Art Unit
2674
Tech Center
2600 — Communications
Assignee
Samsung Electronics Co., Ltd.
OA Round
1 (Non-Final)
88%
Grant Probability
Favorable
1-2
OA Rounds
2y 7m
To Grant
95%
With Interview

Examiner Intelligence

Grants 88% — above average
88%
Career Allow Rate
957 granted / 1094 resolved
+25.5% vs TC avg
Moderate +7% lift
+7.2%
Interview Lift
Allow rate with vs. without interview, among resolved cases with an interview
Typical timeline
2y 7m
Avg Prosecution
18 currently pending
Career history
1112
Total Applications
across all art units

Statute-Specific Performance

§101
8.6%
-31.4% vs TC avg
§103
44.7%
+4.7% vs TC avg
§102
13.5%
-26.5% vs TC avg
§112
3.0%
-37.0% vs TC avg
Black line = Tech Center average estimate • Based on career data from 1094 resolved cases

Office Action

§103 §DP
DETAILED ACTION

[1] Remarks

I. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

II. Claims 1-20 are pending and have been examined. Claims 1-6 and 10-16 are rejected; claims 7-9 and 17-20 are objected to. Explanations are provided below.

III. An inventor and assignee search was performed; no double patenting rejection is necessary.

IV. Patent eligibility (under the 2019 guidance): Claims 1-20 pass the patent eligibility test because no limitation or combination of limitations amounts to an abstract idea. Further, the limitation "estimating an error for a calibration parameter of the camera comprising at least one of a pitch, a roll, or a yaw so that the feature points projected into the world coordinate system satisfy a line parallel condition" adds a specific limitation beyond what is well-understood, routine, and conventional in the field, and provides improvements to the technical field of camera calibration; the claims therefore recite additional elements that integrate any judicial exception into a practical application and amount to significantly more.

V. There is no PCT application associated with the current application.

[2] Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. - An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

Use of the word "means" (or "step for") in a claim with functional language creates a rebuttable presumption that the claim element is to be treated in accordance with 35 U.S.C. 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph). The presumption that 35 U.S.C. 112(f) is invoked is rebutted when the function is recited with sufficient structure, material, or acts within the claim itself to entirely perform the recited function. Absence of the word "means" (or "step for") in a claim creates a rebuttable presumption that the claim element is not to be treated in accordance with 35 U.S.C. 112(f). The presumption that 35 U.S.C. 112(f) is not invoked is rebutted when the claim element recites function but fails to recite sufficiently definite structure, material, or acts to perform that function.

Claim elements in this application that use the word "means" (or "step for") are presumed to invoke 35 U.S.C. 112(f) except as otherwise indicated in an Office action. Similarly, claim elements that do not use the word "means" (or "step for") are presumed not to invoke 35 U.S.C. 112(f) except as otherwise indicated in an Office action.

Claims 12-20 are not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because their limitations are modified by sufficient structure or material for performing the claimed function. Claims 1-11 do not require 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, interpretation because they are method claims and/or computer-readable-medium claims.

Upon examination of the specification and claims, the examiner has determined, under the best understanding of the scope of the claims, that rejection under 35 U.S.C. 112(a)/(b) is not necessitated because sufficient support is provided in the written description and drawings of the invention.

[3] Grounds of Rejection

Claim Rejections - 35 USC § 103

1. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

2. Claims 1-2, 4, 6, 10-13, and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Kosaki (US 8842181) in view of Kim (US 20200250470).
Regarding claim 1, Kosaki discloses a processor-implemented method, the method comprising:

obtaining a driving image captured by a camera mounted on a vehicle (see figure 3, where 11, 12, 13, and 14 are cameras);

segmenting line regions comprising straight lines from the captured driving image (see figures 5A-C, where 2a, 2b, and 2c are line segments);

extracting feature points of the straight lines from the line regions (see figure 5A, where A1 and A2 are feature points on the lines);

[image: media_image1.png]

projecting the feature points of the straight lines into a world coordinate system (see column 3, lines 46-51: when a correspondence between a plurality of feature points of the image photographed by each of the cameras 11 to 14 and positions of the feature points on a three-dimensional coordinate system is established, the camera parameters can be obtained approximately by calculation, where the 3D coordinate system is the world coordinates).

Kosaki is silent in disclosing estimating an error for a calibration parameter of the camera comprising at least one of a pitch, a roll, or a yaw so that the feature points projected into the world coordinate system satisfy a line parallel condition.

Kim discloses estimating an error for a calibration parameter of the camera comprising at least one of a pitch, a roll, or a yaw so that the feature points projected into the world coordinate system satisfy a line parallel condition (see paragraph 78: adjusting pitch calibration module 150 to set the overestimated range and the underestimated range as shown above, and to acquire information on at least one of an overestimated error ratio corresponding to the overestimated range and the underestimated error ratio corresponding to the underestimated range. Herein, the overestimated error ratio may be a ratio of (r1) the number of first specific target objects whose first specific estimated target heights are included in the overestimated range to (r2) the number of the target objects; the pitch error is calculated using object features in the image. Also see figure 3 below):

[image: media_image2.png]

It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to include "estimating an error for a calibration parameter of the camera comprising at least one of a pitch, a roll, or a yaw so that the feature points projected into the world coordinate system satisfy a line parallel condition" because inaccurate orientation angles distort the reconstructed scene, causing real-world parallel lines, such as road lanes, to appear skewed or to converge incorrectly; such a technique avoids image classification errors.

Regarding claim 2, Kosaki discloses the method of claim 1, wherein the obtaining of the driving image comprises: determining whether front lines recognized in the driving image are the straight lines (see figures 6A-6C, where the straight lines are B1 and B2); and obtaining the driving image based on a result of the determining that the front lines are the straight lines (see figure 14: images are captured once the magnification factors and positions are determined in accordance with the previously set vehicle display region; the cameras on the vehicle also continuously capture images around the vehicle, which occurs after the initial image acquisition. See also column 15, lines 56-59: the vehicle 1 preferably moves in a linear manner with a rudder angle as small as possible, and a boundary detection method in the front camera 11 performed under the above assumption is first described, which implies the camera acquires images after movements).
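For readers unfamiliar with the "line parallel condition" recited in independent claim 1, the following is a minimal numerical sketch of the idea: project feature points on two lane boundaries onto the ground plane and search for the pitch correction that makes the projected lines parallel. This is illustrative only, not the applicant's or Kim's actual algorithm; the camera height, the grid-search approach, and every function name below are assumptions.

```python
import numpy as np

H = 1.5  # assumed camera height above the road plane, metres (hypothetical)

def pitch_rot(theta):
    """Rotation about the x-axis by theta radians (camera pitch)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0,   c,  -s],
                     [0.0,   s,   c]])

def make_lane_rays(true_pitch, lateral_offset):
    """Synthesize camera-frame rays toward ground points on the lane line
    x = lateral_offset, as seen by a camera with an unknown pitch error."""
    ys = np.linspace(5.0, 10.0, 6)  # look-ahead distances, metres
    world = np.stack([np.full_like(ys, lateral_offset), ys, np.zeros_like(ys)], axis=1)
    rays_world = world - np.array([0.0, 0.0, H])   # camera sits at (0, 0, H)
    return rays_world @ pitch_rot(true_pitch)      # rotate each ray into the camera frame

def project_to_ground(rays_cam, pitch):
    """Rotate rays by a candidate pitch and intersect them with the plane z = 0."""
    rays_world = rays_cam @ pitch_rot(pitch).T
    t = H / -rays_world[:, 2]          # ray scale at which z reaches the ground
    return rays_world[:, :2] * t[:, None]

def parallelism_error(pts_left, pts_right):
    """2-D cross product of the two lane directions; zero when parallel."""
    def unit_dir(pts):
        d = pts[-1] - pts[0]
        return d / np.linalg.norm(d)
    dl, dr = unit_dir(pts_left), unit_dir(pts_right)
    return abs(dl[0] * dr[1] - dl[1] * dr[0])

def estimate_pitch(rays_left, rays_right):
    """Grid-search the pitch that best satisfies the line parallel condition."""
    grid = np.linspace(-0.05, 0.05, 2001)
    errs = [parallelism_error(project_to_ground(rays_left, p),
                              project_to_ground(rays_right, p))
            for p in grid]
    return float(grid[int(np.argmin(errs))])
```

Note that with only a pitch error modeled, the parallel condition pins down the pitch, while a pure yaw error rotates both projected lanes together and leaves them parallel; that limitation is consistent with the claims reciting separate conditions for roll and yaw (claims 7-9).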
Regarding claim 4, Kosaki discloses the method of claim 1, wherein the obtaining of the driving image comprises: determining whether the vehicle is in a translational motion or a rotational motion (see the figure 8A illustration below); and

[image: media_image3.png]

obtaining the driving image based on a determination that the vehicle is in the translational motion (see column 15, lines 56-59: the vehicle 1 preferably moves in a linear manner with a rudder angle as small as possible, and a boundary detection method in the front camera performed under the above assumption is first described, which implies the camera acquires images after movements).

Regarding claim 6, Kosaki discloses the method of claim 1, wherein the line parallel condition comprises at least one of: a first condition in which, among the feature points projected into the world coordinate system, front distances of two feature points corresponding to a same height of straight lines facing each other are equal (see figure 5A: 2a is a straight line, and feature points A1 and A2, corresponding to a same height of straight lines facing each other, have equal distances at the boundary of the image):

[image: media_image4.png]

Regarding claim 10, Kim discloses the method of claim 1, further comprising: calibrating the calibration parameter of the camera while the vehicle is driving based on the error of the calibration parameter of the camera (see paragraph 78: adjusting pitch calibration module 150 to set the overestimated range and the underestimated range as shown above, and to acquire information on at least one of an overestimated error ratio corresponding to the overestimated range and the underestimated error ratio corresponding to the underestimated range. Herein, the overestimated error ratio may be a ratio of (r1) the number of first specific target objects whose first specific estimated target heights are included in the overestimated range to (r2) the number of the target objects; the pitch error is calculated using object features in the image. Also see figure 3 below):

[image: media_image2.png]

See the motivation for claim 1.

Regarding claim 11, Kosaki discloses a non-transitory computer-readable storage medium storing instructions that, when executed by a processor, cause the processor to perform the method of claim 1 (see column 5, lines 46-49: an image processor for edge extraction processing; that is, color image data is converted into gray-scale data).

Regarding claim 12, see the rationale and rejection for claim 1. Kosaki also includes a camera and processor (see column 5, lines 44-48).

Regarding claim 13, see the rationale and rejection for claim 2. Regarding claim 16, see the rationale and rejection for claim 5.

3. Claims 3 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Kosaki (US 8842181) in view of Kim (US 20200250470) and Suhling (US 20070255133).

Regarding claim 3, the combination of Kosaki and Kim as a whole discloses all the limitations of claim 2, but is silent in disclosing the method of claim 2, wherein the determining of whether the front lines are the straight lines comprises: modeling the front lines with a polynomial; and determining whether the front lines are the straight lines by determining a coefficient of higher-order terms of 2nd-order or more in the polynomial.
Suhling discloses the method of claim 2, wherein the determining of whether the front lines are the straight lines comprises: modeling the front lines with a polynomial (see paragraph 10: description of the envelope of the segments and of the center line of the segments by second-degree polynomials is particularly suitable for the representation in a mathematically closed form); and determining whether the front lines are the straight lines by determining a coefficient of higher-order terms of 2nd-order or more in the polynomial (see paragraph 16: each component of the center line of a catheter segment is described by a second-degree polynomial; paragraph 17: the radius of the envelope of each catheter segment is described by a second-degree polynomial).

It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to include modeling the front lines with a polynomial and determining whether the front lines are the straight lines by determining a coefficient of higher-order terms of 2nd-order or more in the polynomial, because a polynomial represents a straight line when all coefficients of terms of degree 2 or higher are zero; higher-order terms introduce curves and bends, so this check accurately determines whether edges are curved or straight, which improves image recognition.

Regarding claim 14, see the rationale and rejection for claim 3.

4. Claims 5 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Kosaki (US 8842181) in view of Kim (US 20200250470) and Wei (US 9028312).
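The straightness test recited in claim 3 (a detected line is straight when the coefficients of its polynomial fit at degree 2 and above are negligible) can be sketched in a few lines. This is an illustrative reading of the claim language, not Suhling's implementation; the function name, fit degree, and tolerance are assumptions.

```python
import numpy as np

def is_straight(xs, ys, degree=3, tol=1e-3):
    """Fit a polynomial to a detected lane boundary and call it straight
    when every coefficient of degree 2 or higher is near zero."""
    # Coefficients are ordered from constant term up to the highest degree.
    coeffs = np.polynomial.polynomial.polyfit(xs, ys, degree)
    return bool(np.all(np.abs(coeffs[2:]) < tol))
```

For example, points sampled from y = 2x + 1 pass the test, while points from a mild parabola fail it, since the degree-2 coefficient survives the fit.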
Regarding claim 5, the combination of Kosaki and Kim as a whole discloses all the limitations of claim 4, but is silent in disclosing the method of claim 4, wherein the determining of whether the vehicle is in the translational motion comprises determining whether the vehicle is in the translational motion based on at least one of a measured value of an inertial measurement unit (IMU) mounted on the vehicle, a wheel velocity of the vehicle, or a steering angle of the vehicle.

Wei discloses the method of claim 4, wherein the determining of whether the vehicle is in the translational motion comprises determining whether the vehicle is in the translational motion based on at least one of a measured value of an inertial measurement unit (IMU) mounted on the vehicle, a wheel velocity of the vehicle, or a steering angle of the vehicle (see column 34, lines 10-20: other types of sensors may be provided on the vehicle; the sensors may include inertial sensors, such as those provided in an inertial measurement unit (IMU). An IMU can include one or more accelerometers, one or more gyroscopes, one or more magnetometers, or suitable combinations thereof; the IMU can include up to three orthogonal accelerometers to measure linear acceleration of the movable object along up to three axes of translation, and up to three orthogonal gyroscopes to measure the angular acceleration about up to three axes of rotation).
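The translational-motion determination recited in claim 5 could, for illustration, be a simple threshold test over a window of IMU, wheel-speed, and steering samples: the vehicle is treated as translating when it is moving while yaw rate and steering angle stay near zero. This is a hypothetical sketch, not Wei's or the applicant's method; every threshold and name below is an assumption.

```python
import numpy as np

def is_translational(gyro_yaw_rates, steering_angles, wheel_speeds,
                     gyro_tol=0.02, steer_tol=2.0, min_speed=0.5):
    """Heuristic translational-motion check over a sensor window.

    gyro_yaw_rates  : yaw rates from the IMU, rad/s
    steering_angles : steering wheel angles, degrees
    wheel_speeds    : wheel speeds, m/s
    """
    moving = np.mean(wheel_speeds) > min_speed              # vehicle is actually moving
    no_rotation = np.max(np.abs(gyro_yaw_rates)) < gyro_tol  # no significant yaw
    straight_wheel = np.max(np.abs(steering_angles)) < steer_tol
    return bool(moving and no_rotation and straight_wheel)
```

A stationary vehicle or a turning vehicle would both be excluded, which matches the rationale that calibration images should be gathered while driving straight.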
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to include determining whether the vehicle is in the translational motion based on at least one of a measured value of an inertial measurement unit, because these sensors directly capture the physical forces, specifically linear acceleration, associated with movement along an axis, which assists in predicting how the vehicle should move and improves vehicle navigation.

Regarding claim 15, see the rationale and rejection for claim 5.

[4] Claim Objections

Claims 7-9 and 17-20 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

With regards to claim 7, the examiner cannot find any applicable prior art providing teachings for the following limitations: the method of claim 6, wherein: the feature points of the straight lines comprise two feature points corresponding to the same height of straight lines facing each other, and the estimating of the error comprises: calculating front distances to a front of the vehicle from each of the two feature points projected into the world coordinate system; and estimating an error of the roll so that the front distances satisfy the first condition; in combination with the rest of the limitations of claims 1 and 6.
Kosaki discloses the method of claim 6, wherein: the feature points of the straight lines comprise two feature points corresponding to the same height of straight lines facing each other (see the figure 5A illustration below).

[image: media_image5.png]

Kosaki is silent in disclosing the estimating of the error comprises: calculating front distances to a front of the vehicle from each of the two feature points projected into the world coordinate system; and estimating an error of the roll so that the front distances satisfy the first condition.

Regarding claim 8, the examiner cannot find any applicable prior art providing teachings for the following limitations: the method of claim 6, wherein: the feature points of the straight lines comprise four feature points of a quadrangular shape located on straight lines facing each other; the estimating of the error comprises: calculating widths between two feature points facing each other corresponding to straight lines facing each other among the four feature points projected into the world coordinate system; and estimating an error of the pitch so that the widths satisfy the second condition; in combination with the rest of the limitations of claims 1 and 6.

Kosaki discloses the method of claim 6, wherein: the feature points of the straight lines comprise feature points of a quadrangular shape located on straight lines facing each other (see the figure 5A illustration below).

[image: media_image6.png]

Kosaki is silent in disclosing the estimating of the error comprises: calculating widths between two feature points facing each other corresponding to straight lines facing each other among the four feature points projected into the world coordinate system; and estimating an error of the pitch so that the widths satisfy the second condition.
Regarding claim 9, the examiner cannot find any applicable prior art providing teachings for the following limitations: the method of claim 6, wherein: the feature points of the straight lines comprise four feature points of a quadrangular shape located on straight lines facing each other, the estimating of the error comprises: calculating coordinates of center points of two feature points facing each other corresponding to straight lines facing each other among the four feature points projected into the world coordinate system; and estimating an error of the yaw so that the coordinates of the center points satisfy the third condition; in combination with the rest of the limitations of claims 1 and 6.

Kosaki discloses the method of claim 6, wherein the feature points of the straight lines comprise the arrangement shown below.

[image: media_image6.png]

Kosaki is silent in disclosing the estimating of the error comprises: calculating coordinates of center points of two feature points facing each other corresponding to straight lines facing each other among the four feature points projected into the world coordinate system; and estimating an error of the yaw so that the coordinates of the center points satisfy the third condition.

Regarding claim 17, see the rationale for claim 7. Regarding claim 18, see the rationale for claim 8. Regarding claim 19, see the rationale for claim 9. Claim 20 is objected to as well because it depends on a claim with allowable subject matter.

CONTACT INFORMATION

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ALEX LIEW (duty station located in New York City), whose telephone number is (571)272-8623 (FAX 571-273-8623), cell (917)763-1192, or email alexa.liew@uspto.gov. Please note the examiner cannot reply through email unless an internet communication authorization is provided by the applicant. The examiner can be reached anytime.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, MISTRY ONEAL R, can be reached at (313)446-4912. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ALEX KOK S LIEW/
Primary Examiner, Art Unit 2674
Telephone: 571-272-8623
Date: 12/21/25

Prosecution Timeline

Feb 05, 2024
Application Filed
Dec 13, 2025
Non-Final Rejection — §103, §DP
Mar 31, 2026
Interview Requested

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597112
INSPECTION DEVICE, INSPECTION METHOD, AND RECORDING MEDIUM
2y 5m to grant Granted Apr 07, 2026
Patent 12597144
ANTERIOR SEGMENT ANALYSIS APPARATUS, ANTERIOR SEGMENT ANALYSIS METHOD, AND NON-TRANSITORY COMPUTER-READABLE STORAGE MEDIUM
2y 5m to grant Granted Apr 07, 2026
Patent 12597150
OBTAINING A DEPTH MAP
2y 5m to grant Granted Apr 07, 2026
Patent 12579795
DIAGNOSIS SUPPORT SYSTEM, DIAGNOSIS SUPPORT METHOD, AND STORAGE MEDIUM
2y 5m to grant Granted Mar 17, 2026
Patent 12572999
INCREASING RESOLUTION OF DIGITAL IMAGES USING SELF-SUPERVISED BURST SUPER-RESOLUTION
2y 5m to grant Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
88%
Grant Probability
95%
With Interview (+7.2%)
2y 7m
Median Time to Grant
Low
PTA Risk
Based on 1094 resolved cases by this examiner. Grant probability derived from career allow rate.
