Prosecution Insights
Last updated: April 19, 2026
Application No. 18/543,727

METHOD, COMPUTER PROGRAM, AND DEVICE FOR DETERMINING A CALIBRATION MATRIX FOR A CAMERA

Non-Final OA: §102, §103, §112
Filed
Dec 18, 2023
Examiner
SHIN, SOO JUNG
Art Unit
2667
Tech Center
2600 — Communications
Assignee
Elektrobit Automotive GmbH
OA Round
1 (Non-Final)
Grant Probability: 87% (Favorable)
OA Rounds: 1-2
To Grant: 2y 4m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 87% (527 granted / 604 resolved), +25.3% vs TC avg (above average)
Interview Lift: +16.0% on resolved cases with interview
Typical Timeline: 2y 4m avg prosecution, 28 currently pending
Career History: 632 total applications across all art units
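A minimal sketch of how the headline figures above are likely derived from the career data. The cap applied to the interview-adjusted figure is our assumption to match the displayed 99%, not something the page states.

```python
# Career allow rate as reported: granted / resolved cases.
granted, resolved = 527, 604
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # ~87.3%, shown as 87%

# Reported interview lift is +16.0 percentage points; the 99%
# "with interview" figure suggests a cap (assumption, not stated).
with_interview = min(allow_rate + 0.16, 0.99)
print(f"With interview: {with_interview:.0%}")
```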

Statute-Specific Performance

§101: 7.6% (-32.4% vs TC avg)
§103: 37.5% (-2.5% vs TC avg)
§102: 19.9% (-20.1% vs TC avg)
§112: 24.2% (-15.8% vs TC avg)
Comparisons are against the Tech Center average estimate • Based on career data from 604 resolved cases

Office Action

§102 §103 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Priority

Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art.
The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked. As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that use the word “means” or “step” but are nonetheless not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph because the claim limitation(s) recite(s) sufficient structure, materials, or acts to entirely perform the recited function. Such claim limitation(s) is/are: “an object recognition module, which is configured to detect” and “a computing module, which is configured to calculate” in claims 9-15. One of ordinary skill in the art would understand that the modules described in claims 9-15 have sufficient structure, material or acts to perform the recited function because the claims are directed to determining a calibration matrix for a camera to correct perspective distortion. Because this/these claim limitation(s) is/are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are not being interpreted to cover only the corresponding structure, material, or acts described in the specification as performing the claimed function, and equivalents thereof.

If applicant intends to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to remove the structure, materials, or acts that performs the claimed function; or (2) present a sufficient showing that the claim limitation(s) does/do not recite sufficient structure, materials, or acts to perform the claimed function.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 4-5, 12-13, 15, 18-19, and 21 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Claims 4, 12, and 18 recite the limitation “the features extracted.” There is no antecedent basis for this limitation in the claims. For the purpose of further examination, the limitation has been interpreted as “features extracted.” Claims 5, 13, and 19 depend from claims 4, 12, and 18, respectively, and therefore inherit this deficiency.

Claims 15 and 21 recite the limitation “the method steps are repeated…” There is no antecedent basis for “the method steps” in the claims. For the purpose of further examination, claim 15 has been interpreted as “the calculation is repeated …” (to correspond to the computing module) and claim 21 has been interpreted as “the operations are repeated …”

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C.
102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claim(s) 1-4, 7-12, 15-18, and 21 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Huang et al. (“Video Stabilization with Distortion Correction for Wide-angle Lens Dashcam,” 2016 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA), 13-16 December 2016), hereinafter referred to as Huang.

Regarding claims 1, 8, and 9, Huang teaches a method, non-transitory computer-readable storage medium having stored thereon computer-executable instructions that, when executed by a computer, cause the computer to carry out operations, and device for determining a calibration matrix for a camera (Huang Abstract: “This paper proposes a video stabilization algorithm … to better stabilize consecutive images captured by wide-angle dashcam mounted on a vehicle”; Huang pg. 3 right column: “real driving videos” – the algorithm is run on a vehicle comprising a processor and memory in order to process the real driving videos), comprising the following steps/operations:

detecting at least one object, which moves in a sequence of camera images between image areas having different levels of distortion (Huang Abstract & pg. 3 discussed above; Huang Fig. 1: “Feature extraction”; Huang pg. 2 left column: “Scale Invariant Feature Transform (SIFT) is performed on each image for the subsequence processing”; Huang pg. 2 right column: “For an input video, we detect image pair using standard SIFT”; Huang Fig. 2: the detected objects correspond to a plurality of vehicles in motion);

calculating an expected perspective-related distortion of the at least one object (Huang Fig. 1: “Motion and Distortion Estimation”; Huang pg. 2 right column: “For every real camera, some amount of radial distortion is always present and for eq. (1) to be true, one must first undistort the image points … we use the one-parameter division model for radial distortion modeling”; Huang Fig. 3: “Input video –distortion→ distorted input video”);

determining an observed distortion of the at least one object (Huang Fig. 3: “Distortion Parameter & Camera motion estimation → Undistortion”; Huang pg. 2 right column: “By combining eq. (1) with the same undistortion model, we obtain the following equation relating image correspondences distorted with different amounts of radial distortion”);

calculating a camera-related distortion of the at least one object from the observed distortion and the expected perspective-related distortion (Huang pg. 2 & Fig. 3 discussed above); and

calculating a calibration matrix from the camera-related distortion (Huang Fig. 1: “Motion Compensation”; Huang pg. 3 right column: “the stabilization matrix … each image point, xi,t, which is undistorted by the estimated λt is transformed”).

Regarding claims 2, 16, and 10, Huang teaches the method, non-transitory computer-readable storage medium, and device as claimed in claims 1, 8, and 9, wherein object recognition and object classification are carried out for the detection of the at least one object (Huang Fig. 2).

Regarding claims 3, 17, and 11, Huang teaches the method, non-transitory computer-readable storage medium, and device as claimed in claims 1, 8, and 9, wherein distance and position of the at least one object are determined in successive camera images and the expected perspective-related distortion is calculated on the basis of distance and position (Huang pg. 2 right column: “the homogeneous coordinate of the detected and radially distorted image points”; Huang eq. (1)-(4): the equations require the positions of each feature point – Huang uses a polynomial radial distortion model which uses a polynomial distortion function to transform between distorted and undistorted pixels in correspondence to the radii, which is the Euclidean distance from the distortion center to a distorted point).

Regarding claims 4, 18, and 12, Huang teaches the method, non-transitory computer-readable storage medium, and device as claimed in claims 1, 8, and 9, wherein feature tracking is carried out for the at least one object, and the features extracted during the feature tracking are used for calculating the expected perspective-related distortion and determining the observed distortion (Huang Fig. 1: “Feature Extraction”; Huang Fig. 2).

Regarding claims 7, 21, and 15, Huang teaches the method, non-transitory computer-readable storage medium, and device as claimed in claims 1, 8, and 9, wherein the calculated calibration matrix is applied to the camera images and the method steps are repeated iteratively until a minimal camera-related distortion is achieved (Huang pg. 2 left column: “applied iteratively to remove outliers”).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 5, 13 and 19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Huang et al. (2016 APSIPA, 13-16 December 2016), in view of Parchami et al. (US 2023/0145701 A1), hereinafter referred to as Huang and Parchami, respectively.

Regarding claims 5, 19, and 13, Huang teaches the method, non-transitory computer-readable storage medium, and device as claimed in claims 4, 18, and 12, but does not appear to explicitly teach that vanishing points in the camera images are determined for the calculation of the expected perspective-related distortion.

Pertaining to the same field of endeavor, Parchami teaches that vanishing points in the camera images are determined for the calculation of the expected perspective-related distortion (Parchami ¶ 0023: “instructions to determine the first distance based on a distance, in pixels, from the vanishing point to the center of the bottom face, a distance, in pixels, from the vanishing point to the intersection, and the calibration parameters”; Parchami Fig. 6 & ¶ 0060: “Traffic flow analysis begins by determining a vanishing point x∞ in the image 300. Vanishing point x∞ can be determined by constructing a series of lines 604, 606, 608, 610 (dotted lines) along features known to be parallel in the real-world, i.e., traffic lanes on the roadway 210. The vanishing point x∞ is the point where the lines 604, 606, 608, 610 meet due to perspective distortion in the image 300. Assume the problem is to determine a distance dx1,x2 between points x1 and x2 in image 300 in real-world coordinates. Then the problem can be solved by expressing the distance dx1,x2 as a cross-ratio invariance equation”).
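The one-parameter division model that the rejection quotes from Huang can be illustrated with a short sketch. The λ value, distortion center, and sample points below are illustrative assumptions, not values taken from Huang:

```python
def undistort_division(point, lam, center=(0.0, 0.0)):
    """One-parameter division model: x_u = c + (x_d - c) / (1 + lam * r^2),
    where r is the distance of the distorted point x_d from the center c."""
    dx, dy = point[0] - center[0], point[1] - center[1]
    r2 = dx * dx + dy * dy
    s = 1.0 / (1.0 + lam * r2)
    return (center[0] + s * dx, center[1] + s * dy)

# Illustrative lambda only: a negative value models barrel distortion,
# so undistortion pushes points outward, more strongly near the edges.
lam = -1e-6
print(undistort_division((100.0, 0.0), lam))
print(undistort_division((400.0, 300.0), lam))
```

In Huang's pipeline the λ per frame is estimated from tracked feature correspondences and folded into the stabilization matrix; this sketch shows only the pointwise undistortion step.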
Huang and Parchami are considered to be analogous art because they are directed to image processing and camera calibration. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the video stabilization with distortion correction for wide-angle lens dashcam (as taught by Huang) to determine the vanishing point (as taught by Parchami) because the combination can solve the perspective distortion problem using a cross-ratio invariance (Parchami ¶ 0060).

Allowable Subject Matter

Claims 6, 14, and 20 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. The following is a statement of reasons for the indication of allowable subject matter:

Regarding claims 6, 20, and 14, the closest prior art (Huang) teaches the method, non-transitory computer-readable storage medium, and device as claimed in claims 1, 8, and 9, wherein the scaling of the pixel values increases from a central area of the camera images towards edge areas of the camera images (Huang pg. 2 right column: “A. Radial affine matrix estimation” – as noted above, Huang uses a polynomial radial distortion model which uses a polynomial distortion function to transform between distorted and undistorted pixels in correspondence to the radii, which is the Euclidean distance from the distortion center to a distorted point).

An additional prior art reference (Usamentiaga et al., “Static Calibration for Line-Scan Cameras Based on a Novel Calibration Target,” IEEE Transactions on Instrumentation and Measurement, Vol. 71, 5015812, 2022, hereinafter Usamentiaga) further teaches that the calibration matrix is based on line by line and column by column scaling of the pixel values of the camera images (Usamentiaga Fig. 2: the calibration increases from the center to the edges, center is indicated by red +, the values increase column-by-column and line/row-by-line/row; further see Usamentiaga Fig. 4).

However, the prior art, alone or in combination, does not appear to teach or suggest that the calibration matrix is based on line-by-line and column-by-column scaling of the pixel values in which the actual pixel values increase outward from the center (shown in Applicant’s Figs. 8-9).

[Usamentiaga Fig. 2 and Fig. 4; Application Fig. 8 and Fig. 9 — images omitted]

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SOO J SHIN whose telephone number is (571) 272-9753. The examiner can normally be reached M-F, 10-6. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Matthew Bella, can be reached at (571) 272-7778. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Soo Shin/
Primary Examiner, Art Unit 2667
(571) 272-9753
soo.shin@uspto.gov
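Parchami's vanishing-point construction quoted in the §103 rejection (intersecting lines fitted along features known to be parallel in the real world, such as lane markings) can be sketched as a least-squares line intersection. The lane-line coordinates below are synthetic examples, not data from the reference:

```python
import math

def vanishing_point(lines):
    """Least-squares intersection of 2D lines given as (point, direction)
    pairs: minimizes sum_i ||(I - u_i u_i^T)(x - p_i)||^2 via the
    2x2 normal equations (sum_i M_i) x = sum_i M_i p_i."""
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (px, py), (dx, dy) in lines:
        n = math.hypot(dx, dy)
        ux, uy = dx / n, dy / n
        # M = I - u u^T projects onto the line's normal direction.
        m11, m12, m22 = 1 - ux * ux, -ux * uy, 1 - uy * uy
        a11 += m11; a12 += m12; a22 += m22
        b1 += m11 * px + m12 * py
        b2 += m12 * px + m22 * py
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Two synthetic "lane lines" in a 640x480 image that meet at (320, 100):
lanes = [((120, 480), (200, -380)), ((520, 480), (-200, -380))]
print(vanishing_point(lanes))
```

With more than two fitted lane lines the same normal equations give the point closest to all of them, which is how noisy real-image fits are typically handled.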

Prosecution Timeline

Dec 18, 2023
Application Filed
Mar 09, 2026
Non-Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602768
SURFACE DEFECT DETECTION MODEL TRAINING METHOD, AND SURFACE DEFECT DETECTION METHOD AND SYSTEM
2y 5m to grant Granted Apr 14, 2026
Patent 12586411
TARGET IDENTIFICATION DEVICE, ELECTRONIC DEVICE, TARGET IDENTIFICATION METHOD, AND STORAGE MEDIUM
2y 5m to grant Granted Mar 24, 2026
Patent 12586204
Detecting Optical Discrepancies In Captured Images
2y 5m to grant Granted Mar 24, 2026
Patent 12586216
METHOD OF DETERMINING A MOTION OF A HEART WALL
2y 5m to grant Granted Mar 24, 2026
Patent 12573021
ULTRASONIC DEFECT DETECTION AND CLASSIFICATION SYSTEM USING MACHINE LEARNING
2y 5m to grant Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

AI Strategy Recommendation

Get an AI-powered prosecution strategy using examiner precedents, rejection analysis, and claim mapping.
Powered by AI — typically takes 5-10 seconds

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 87%
With Interview (+16.0%): 99%
Median Time to Grant: 2y 4m
PTA Risk: Low
Based on 604 resolved cases by this examiner. Grant probability derived from career allow rate.
