Prosecution Insights
Last updated: April 19, 2026
Application No. 18/122,929

ONBOARD CAMERA CALIBRATION APPARATUS

Final Rejection §103

Filed: Mar 17, 2023
Examiner: TERRELL, EMILY C
Art Unit: 2666
Tech Center: 2600 — Communications
Assignee: Denso Ten Limited
OA Round: 2 (Final)

Grant Probability: 59% (Moderate)
Expected OA Rounds: 3-4
Time to Grant: 2y 8m
With Interview: 94%

Examiner Intelligence

Career Allow Rate: 59% (316 granted / 537 resolved; -3.2% vs TC avg)
Interview Lift: +35.4% for resolved cases with an interview (a strong lift)
Avg Prosecution: 2y 8m (typical timeline)
Currently Pending: 18
Career History: 555 total applications across all art units
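The headline numbers in this panel are simple ratios over the examiner's career counts. A minimal sketch reproducing them (variable names are illustrative, not taken from any real data feed):

```python
# Hypothetical reconstruction of the panel's figures from the raw counts.
granted, resolved, total_career = 316, 537, 555

allow_rate = granted / resolved      # career allowance rate
pending = total_career - resolved    # applications not yet resolved

print(f"allow rate: {allow_rate:.1%}")  # ≈ 58.8%, displayed as 59%
print(f"pending:    {pending}")         # 18 currently pending
```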

Statute-Specific Performance

§101: 4.2% (-35.8% vs TC avg)
§103: 54.8% (+14.8% vs TC avg)
§102: 20.9% (-19.1% vs TC avg)
§112: 15.8% (-24.2% vs TC avg)

Deltas are relative to an estimated Tech Center average. Based on career data from 537 resolved cases.
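What each percentage measures is not stated on the page, but the displayed deltas can be inverted to recover the implied Tech Center baseline. A quick check, assuming delta = examiner rate minus TC average:

```python
# Illustrative arithmetic only: invert each "vs TC avg" delta
# to find the Tech Center baseline it implies.
rates = {"101": (4.2, -35.8), "103": (54.8, +14.8),
         "102": (20.9, -19.1), "112": (15.8, -24.2)}

for statute, (rate, delta) in rates.items():
    tc_avg = rate - delta
    print(f"§{statute}: examiner {rate}% vs TC avg {tc_avg:.1f}%")
```

With the figures shown, every statute's implied baseline works out to 40.0%, consistent with a single TC-average reference value behind all four deltas.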

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Status

In the application filed March 17, 2023, claim 1 was amended, claims 2-10 were cancelled, and claims 11-22 were added. In the remarks and amendments received August 18, 2025, claims 1, 12, 14, 15, 17, 18, and 22 were amended, claims 2-11, 13, 16, and 19-21 were cancelled, and claim 23 was added. Accordingly, claims 1, 12, 14, 15, 17, 18, 22 and 23 are currently pending in the application for examination.

Specification

The objections to the Specification are removed in response to the remarks and amendments received August 18, 2025.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;

(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and

(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation is: “a controller” in claims 1 and 18.

Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it is being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. Regarding “a controller”, the disclosure recites the following structure in [Paragraph 0048]: “The controller 15 is implemented by, for example, a central processing unit (CPU) or a micro processing unit (MPU) executing a computer program (not illustrated) according to the embodiment stored in the memory 14 with RAM as a work area. The controller 15 can be implemented by an integrated circuit such as an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA).”

If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1, 12, 14, 15, 18, 22 and 23 are rejected under 35 U.S.C. 103 as being unpatentable over Oba et al. (Oba; U.S. Patent Application Publication 2018/0365859 A1).

Regarding Claim 1, (Currently Amended) Oba discloses the aspects of the onboard camera calibration apparatus ([0090] FIG. 4 is a block diagram showing a specific example configuration of a camera calibration system as an image processing system to which the present technology is applied. The camera calibration system 21 is a system mounted on a vehicle.) comprising: a controller (image signal processing device 31; [0091] In the example shown in FIG. 4, the camera calibration system 21 includes not only the cameras 11-1 through 11-4 and the image signal processing device 31 shown in FIG. 2, but also a display unit 41, a communication network 42, a drive system control unit 43, an external information detection unit 44, and an environment sensor 45.) configured to:

(i) set a first region of interest with a rectangular shape in a captured image captured by an onboard camera mounted in a vehicle ([0179] The example in FIG. 15 shows an overhead image 211, a long-sided cone 221 representing the field angle of the front camera (the camera 11-1) of the vehicle body 1, an imaged road surface 222 in a forward tilting state, a road surface 223 predicted at the time of shipment from the factory, and the like. As shown in FIG. 15, the surround-view overhead image 211 is obtained by converting a view field image of a region to be a rectangular region, for example, on the road surface from the wide angle camera 11-1.);

(ii) set a second region of interest with a substantially trapezoidal shape (trapezoidal surface range) in which a region other than a road surface is removed from the first region of interest after performing a first calibration process of the onboard camera (buildings and distant line segments removed after initial factory calibration in the first calibration process, see ¶0179 above) based on an optical flow of first feature points extracted from the first region of interest ([0194] If it is determined in step S136 that the linearity determination marker line segment is not located on the road surface, the process returns to step S131, and the procedures that follow are also repeated for all the other detected lines. It should be noted that since indefinite line segments such as distant line segments and roadside buildings outside the road surface are unnecessary, the relevant process may be limited to a predictable road surface range such as a finite neighbor trapezoidal range. Further, at this stage, the vehicle closest line segment includes information with the highest accuracy, and therefore, line segment reliability weighting may be performed in accordance with the distances from the vehicle body.);

(iii) perform a second calibration process of the onboard camera based on an optical flow of second feature points on the road surface ([0147] The optical flow filter 126 detects the velocity at which a specific number of points on the object move between the time series image frames, and supplies information about the detected velocity to the infinite point statistic calculator 127. The infinite point statistic calculator 127 performs an infinite point statistic calculation in which the velocity becomes 0 during linear motion in the in-screen set of the velocity vector, and detects an infinite point. For example, a feature point detected above the horizontal line viewed from a mounted camera flows upward as the feature point becomes closer. A feature point detected below the horizontal line flows downward as the feature point becomes closer. On the left side in the running direction of the in-vehicle cameras, a detected feature point flows to the left as the vehicle runs forward. On the right side, on the other hand, a detected feature point flows to the right. This flow is called an optical flow. That is, an infinite point is the coordinate position where the vector of the optical flow is 0 in the distribution of the optical flow in the vector screen, and the infinite point statistic calculator 127 supplies the infinite point offset calculation unit 129 with information about the infinite point detected through the statistic calculation of the detected optical flow.) extracted from the second region of interest ([0195] The parameter control unit 113 supplies the parameter optimization unit 114 with the parameters provisionally recorded in the camera external parameter offset value memory 134 and the information about the field angle shift converted by the field angle shift conversion unit 138. In step S137, if the supplied parameters show that there is a positional shift between adjacent cameras 11, the parameter optimization unit 114 corrects the vehicle body height and the vehicle body rotation to achieve continuity.);

(iv) start a first image recognition process based on the captured image for driving support after completion of the first calibration process and before completion of the second calibration process (image recognition is performed after factory initial calibration in the first calibration of the vehicle and before the correction to the vehicle body height and rotation deemed the second calibration process or second recalibration); and

(v) start a second image recognition process based on the captured image for the driving support after completion of the second calibration process (image area and images are used to recalibrate the vehicle cameras, height and rotation based on the images taken and the assessments made).

Oba teaches several processes in the Camera Calibration System, as well as an Example Configuration of a Camera and the Image Processing IC, where the feature points and infinite point contribute to the recalibration process and necessity of recalibration (see ¶0147-0149) of one, or all, cameras. Therefore, it would have been obvious to one of ordinary skill in the art prior to the effective filing date of the presently filed invention to modify the system of Oba with the teachings of the embodiments, so as to detect a positional shift of a camera and correct the shift even when the vehicle is running.

Regarding Claim 12, (Currently Amended) Oba further discloses the aspects of the onboard camera calibration apparatus according to claim 1, wherein the first calibration process is performed in the first region of interest with the rectangular shape (initial calibration, see citations for claim 1 above), and the second calibration process is performed in the second region of interest with the substantially trapezoidal shape (recalibration, see citations for claim 1 above).

Regarding Claim 14, (Currently Amended) Oba further discloses the aspects of the onboard camera calibration apparatus according to claim 1, wherein the controller is configured to prohibit performing the first image recognition process until the first calibration process is completed (image calibration is performed, and then the image recognition process begins again in each cycle of the calibration processes; thus the first image calibration process is performed prior to the first image recognition process following the calibration, S64, [Processes in the Camera Calibration System]; see Figure 6 and associated discussions of calibration procedures, see also [0149]).

Regarding Claim 15, (Currently Amended) Oba further discloses the aspects of the onboard camera calibration apparatus according to claim 1, wherein the controller is configured to perform the first image recognition process during performing the second calibration process (see [Synchronization Process], where the cameras perform imaging, and the camera synchronization master signal management unit 92 and the difference management unit 93 synchronize as per the Figure 8 discussion, [0161]-[0163]).

Regarding Claim 18, (Currently Amended) Please see the rejection of claim 1 as articulated above, as the limitations of the method are interpreted and rejected in light of the steps met in claim 1 above.

Regarding Claim 22, (Currently Amended) Oba further discloses the aspects of the non-transitory computer-readable recording medium storing a program that causes a computer of an onboard camera calibration apparatus to execute a process ([0237] The above described series of processes can be performed by hardware, or can be performed by software. In a case where the series of processes are to be performed by software, the program that forms the software is installed into a computer. Note that examples of the computer include a computer embedded in dedicated hardware and a general-purpose personal computer capable of executing various functions by installing various programs therein. See also [0243]-[0249]). Please see the rejection of claim 1 as articulated above, as the limitations of the process are interpreted and rejected in light of the steps met in claim 1 above.

Regarding Claim 23, (New) Oba further discloses the aspects of the onboard camera calibration apparatus according to claim 1, wherein the controller includes a memory that stores the captured image captured by the onboard camera ([0100] The display control unit 63 causes the display unit 41 to display an image corresponding to combined image signals that have been calibrated by the image processing IC 62 and been subjected to overhead-view transform and image combining in accordance with the internal parameters of the respective cameras 11 and external parameters. The memory 64 stores temporary data of the image processing IC 62. See also ¶0124).

Claim 17 is rejected under 35 U.S.C. 103 as being unpatentable over Oba et al. (Oba; U.S. Patent Application Publication 2018/0365859 A1) in view of Fridman (U.S. Patent Application Publication 2018/0024568 A1).

Regarding Claim 17, (Currently Amended) Oba does not explicitly state the image recognition process is sent to a server; however, Oba does teach, “[0223] The general-purpose communication interface 2620 is a general communication interface that mediates communication with various devices existing in external environments 2750. The general-purpose communication interface 2620 may implement a cellular communication protocol such as GSM (registered trademark) (Global System of Mobile communications), WiMAX, LTE (Long Term Evolution), or LTE-A (LTE-Advanced), or some other wireless communication protocol such as wireless LAN (also called Wi-Fi (registered trademark)). The general-purpose communication interface 2620 may be connected to a device (an application server or a control server, for example) existing in an external network (the Internet, a cloud network, or a company-specific network, for example) via a base station or an access point. Alternatively, the general-purpose communication interface 2620 may be connected to a terminal (a terminal of a pedestrian or a shop, or a machine type communication (MTC) terminal, for example) existing in the vicinity of the vehicle, using the peer-to-peer (P2P) technology. See also ¶0249-0250.”

In the same field of endeavor, systems and methods for navigating vehicles, Fridman teaches the aspects of the onboard camera calibration apparatus according to claim 1, wherein the controller is configured to allow results of the first image recognition process and the second image recognition process to be sent to a server: “[0274] The server may average landmark properties received from multiple vehicles that traveled along the common road segment, such as the distances between one landmark to another (e.g., a previous one along the road segment) as measured by multiple vehicles, to determine an arc-length parameter and support localization along the path and speed calibration for each client vehicle. The server may average the physical dimensions of a landmark measured by multiple vehicles traveled along the common road segment and recognized the same landmark. The averaged physical dimensions may be used to support distance estimation, such as the distance from the vehicle to the landmark. The server may average lateral positions of a landmark (e.g., position from the lane in which vehicles are travelling in to the landmark) measured by multiple vehicles traveled along the common road segment and recognized the same landmark. The averaged lateral position may be used to support lane assignment. The server may average the GPS coordinates of the landmark measured by multiple vehicles traveled along the same road segment and recognized the same landmark. The averaged GPS coordinates of the landmark may be used to support global localization or positioning of the landmark in the road model. [0324] In some embodiments, generic visual features may be used as landmarks for the purpose of registering the position and orientation of a moving vehicle, in one drive (localization phase), relative to a map generated by vehicles traversing the same stretch of road in previous drives (mapping phase). These vehicles may be equipped with calibrated cameras imaging the vehicle surroundings and GPS receivers. The vehicles may communicate with a central server (e.g., server 1230) that maintains an up-to-date map including these visual landmarks connected to other significant geometric and semantic information (e.g. lane structure, type and position of road signs, type and position of road marks, shape of nearby drivable ground area delineated by the position of physical obstacles, shape of previously driven vehicle path when controlled by human driver, etc.). The total amount of data that may be communicated between the central server and vehicles per length of road is small, both in a mapping and localization phases.”

Therefore, it would have been obvious to one of ordinary skill in the art prior to the effective filing date of the presently filed invention to modify the system of Oba with the teachings of Fridman in order to maintain “an up-to-date map including these visual landmarks connected to other significant geometric and semantic information (e.g. lane structure, type and position of road signs, type and position of road marks, shape of nearby drivable ground area delineated by the position of physical obstacles, shape of previously driven vehicle path when controlled by human driver, etc.)” as taught by Fridman.

Response to Arguments and Amendments

Applicant’s arguments with respect to independent claims and dependent claims have been considered but are moot because the new ground of rejection does not rely on the combination of references applied in the prior rejection of record for any teaching or matter specifically challenged in the argument, as facilitated by the newly added amendments.

Conclusion

The following prior art, cited but not relied upon in the present rejection, is considered extremely pertinent to Applicant’s invention: WATANABE US 20200051282 A1: [0073] In the present embodiment, measurement of three-dimensional information using the stereo camera 102 is performed by the parallel stereo method. The measurement principle of three-dimensional information obtained from the stereo camera 102 will be described with reference to FIG. 3. (image omitted)

Applicant’s amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Emily C Terrell, whose telephone number is (571) 270-3717. The examiner can normally be reached Monday through Thursday, 7 a.m.-4 p.m. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/EMILY C TERRELL/
Supervisory Patent Examiner, Art Unit 2666

Prosecution Timeline

Mar 17, 2023: Application Filed
Jul 31, 2023: Response after Non-Final Action
May 12, 2025: Non-Final Rejection — §103
Aug 18, 2025: Response Filed
Feb 28, 2026: Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12586167: MEDICAL IMAGE PROCESSING APPARATUS AND MEDICAL IMAGE PROCESSING METHOD (granted Mar 24, 2026; 2y 5m to grant)
Patent 12573072: SYSTEM AND METHOD FOR OBJECT DETECTION IN DISCONTINUOUS SPACE (granted Mar 10, 2026; 2y 5m to grant)
Patent 12561956: AFFORDANCE-BASED REPOSING OF AN OBJECT IN A SCENE (granted Feb 24, 2026; 2y 5m to grant)
Patent 12518397: AUTOMATED DETERMINATION OF A BASE ASSESSMENT FOR A POSE OR MOVEMENT (granted Jan 06, 2026; 2y 5m to grant)
Patent 12493960: USER INTERFACE FOR VISUALIZING DIFFERENCES BETWEEN MEDICAL IMAGE CONTOURINGS (granted Dec 09, 2025; 2y 5m to grant)

Study what changed to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 59%
With Interview: 94% (+35.4%)
Median Time to Grant: 2y 8m
PTA Risk: Moderate

Based on 537 resolved cases by this examiner. Grant probability is derived from the career allow rate.
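The "with interview" projection appears to be the base grant probability plus the interview lift, capped at 100%; this is an assumption about how the page combines the two figures, not a documented formula:

```python
# Assumed combination rule: additive lift, capped at 100%.
base = 0.59     # career allow rate used as base grant probability
lift = 0.354    # interview lift among resolved cases

with_interview = min(base + lift, 1.0)
print(f"{with_interview:.0%}")  # 94%, matching the displayed figure
```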
