Prosecution Insights
Last updated: April 19, 2026
Application No. 18/516,799

SPATIAL CALIBRATION METHOD

Non-Final OA §102, §112
Filed
Nov 21, 2023
Examiner
BEATTY, TY MITCHELL
Art Unit
2663
Tech Center
2600 — Communications
Assignee
Canon Kabushiki Kaisha
OA Round
1 (Non-Final)
Grant Probability: 70% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 1m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 70% (19 granted / 27 resolved; +8.4% vs TC avg, above average)
Interview Lift: +42.3% (strong; among resolved cases with interview)
Avg Prosecution: 3y 1m (typical timeline; 15 currently pending)
Total Applications: 42 (career, across all art units)

Statute-Specific Performance

§101: 7.1% (-32.9% vs TC avg)
§103: 42.8% (+2.8% vs TC avg)
§102: 27.1% (-12.9% vs TC avg)
§112: 23.1% (-16.9% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 27 resolved cases

Office Action

§102, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

1. Claims 1, 11, and 16, and their respective dependent claims, are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claims 1 and 16 recite "determining, iteratively", and it is not clear which steps are iterative. Claims 1 and 16 recite "displacing a position of a point of interest;", and it is not clear which image is associated with "a point of interest" or whether this point of interest is one of the previously identified points. Claim 11 recites "if one or more of the following conditions is/are fulfilled", where "is/are" makes the claim indefinite.

2. Claim 2 is rejected under 35 U.S.C. 112(d) or pre-AIA 35 U.S.C. 112, 4th paragraph, as being of improper dependent form for failing to further limit the subject matter of the claim upon which it depends, or for failing to include all the limitations of the claim upon which it depends. Claim 2 has improper dependency since it removes a limitation rather than further limiting the subject matter. Applicant may cancel the claim(s), amend the claim(s) to place the claim(s) in proper dependent form, rewrite the claim(s) in independent form, or present a sufficient showing that the dependent claim(s) complies with the statutory requirements.

3. Claim 15 is rejected under 35 U.S.C. 112(d) or pre-AIA 35 U.S.C. 112, 4th paragraph, as being of improper dependent form for failing to further limit the subject matter of the claim upon which it depends, or for failing to include all the limitations of the claim upon which it depends. Claim 15 has improper dependency in view of MPEP 608.01(n)(III), where the method steps of claim 1 need not be performed even if the apparatus is capable of those steps. Applicant may cancel the claim(s), amend the claim(s) to place the claim(s) in proper dependent form, rewrite the claim(s) in independent form, or present a sufficient showing that the dependent claim(s) complies with the statutory requirements.

Examiner's Note

4. The Examiner was unable to find a special definition of "projection model" in the Specification of the present Application, and "projection model" is understood to be, e.g., a different set of parameters or transformations used to match one image to another.

Claim Rejections - 35 USC § 102

5. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

6. Claims 1-16 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by US 20190197734 A1 by Forrest Briggs (hereinafter "Briggs").

Regarding claim 1, as best understood: A method for spatial calibration of a first image against a second image, the method comprising (Briggs, Abstract: "The camera rig system is calibrated based on the rectilinear images by adjusting a transformation of the images to reduce a displacement error of the identified sets of key points in a dimension."): obtaining a plurality of pairs of points of interest, each pair matching a point of interest in the first image to a corresponding point of interest in the second image (Briggs, Abstract: "A set of key points included in both images of an image pair is identified for each image pair."); determining, iteratively, a projection model between the first and second images using pairs of points of interest of the obtained plurality (Briggs, Fig. 6 discloses an iterative process for determining a projection model, and P[0048]: "the optimization module 440 may repeat steps 608-620. In some embodiments, a calibration process is completed when the objective is achieved and a final parameter vector is obtained. Accordingly, the image processing system 100 may perform step 550 shown in FIG. 5 to transform the other set of images captured by the calibrated camera rig system 130."); and calibrating the first image based on the determined projection model (Briggs, P[0048], quoted above); wherein an iteration of the determining comprises: displacing a position of a point of interest (Briggs, Abstract: "The optimization problem reduces a displacement error to align the key points of the rectilinear images by adjusting calibration of the cameras or a transform of the images (which corresponds to camera calibration). The image processing system may jointly rectify, i.e., calibrate, multiple cameras simultaneously.", and Figs. 3B-3D, which show displacement of a point of interest); determining a projection model candidate using multiple pairs of points of interest, the multiple pairs including the point of interest with the displaced position (Briggs, Figs. 3B-3D show projection model candidates); and evaluating a reprojection error between the positions of the points of interest as projected using the projection model candidate and the corresponding points of interest (Briggs, Figs. 3B-3D show projection model candidates and the displacement/reprojection error, where the goal is to minimize the error; see P[0004]: "An optimized parameter vector may be obtained when total displacement error of the matching key points in an equirectangular projection is minimized. The positions of the images may be used to determine calibration parameters."); wherein the projection model is determined by choosing a projection model candidate based on the evaluated reprojection errors associated with a plurality of projection model candidates (Briggs, Figs. 3B-3D and P[0004], quoted above, where the chosen model is the model having the lowest displacement error).

Regarding claim 2, as best understood: wherein an iteration of the determining comprises determining a projection model candidate using multiple pairs of points of interest as obtained at the obtaining step, without displacing any position of one of their points of interest, is disclosed by Briggs in P[0038]: "the optimization module 440 may use the parameter vector to adjust at least one transformation of the images to reduce a displacement error of the identified key points.", where the module may or may not reduce a displacement error, since "may" is not definitive.

Regarding claim 3: wherein the iterated steps of determining a projection model candidate and evaluating its associated reprojection error are repeated for different sets of positions of the plurality of points of interest of one of the first or second images until a completion condition is fulfilled is disclosed by Briggs, where the optimization module 440 uses the multiple sets of positions, including corners and keypoints, as shown in Figs. 3B-3D, and where the completion condition is given in P[0044]: "the optimization module 440 stops the gradient descent loop and obtains a final parameter vector {right arrow over (θ)} responsive to determining that a particular criteria is satisfied. Example criteria may include reducing the aggregate displacement error under a threshold error, or iterating for a certain number of rounds in the gradient decent loop".

Regarding claim 4: wherein, for at least one point of interest, a set of positions are explored, the set of positions comprising an initial position of the point of interest as obtained and one or more other positions resulting from displacing a current position of the point of interest according to a displacement policy, is disclosed by Briggs in Figs. 3B-3D, where matching points and corner points are used and moved to explore the scene with the goal of obtaining the lowest value for displacement by connecting the matching key point pair, and where the displacement policy is guided by the optimization module 440; Abstract: "The optimization problem reduces a displacement error to align the key points of the rectilinear images by adjusting calibration of the cameras or a transform of the images (which corresponds to camera calibration)".

Regarding claim 5: wherein the displacement policy defines a search area relatively to a reference position and one or more displacement lengths is disclosed by Briggs in Figs. 3B-3D, where the search area is the area within the overlapping frames, and where the measured displacement in Fig. 3C is optimized to be reduced to a minimum acceptable value, relative to the search area.

Regarding claim 6: wherein the set of explored positions are the positions reachable using the one or more displacement lengths within the search area, where the reference position is the initial position, is disclosed by Briggs in Figs. 3B-3D, where the search area is the area of the overlapping frames, and the explored positions are where the matching key point pair 370 of Fig. 3C moves to, as shown in Fig. 3D, where the displacement length is used within the search area to bring the matching key point pairs closer together.

Regarding claim 7: wherein the set of explored positions (Briggs, Figs. 3B-3D) are the positions reachable using the one or more displacement lengths within a plurality of search areas determined iteratively (Briggs, Figs. 3B-3D show the search areas 360A and 360B, which are updated iteratively using the displacement values and new search areas as the matching key points are brought closer together over iterations), the reference position of an initial search area is the initial position and the reference position of a following search area is the position explored in a previous search area (Briggs, Figs. 3B-3D: the moving keypoint and frames create new areas as the keypoints are brought closer together) for which the corresponding reprojection error is the lowest or below a threshold (Briggs, P[0044], quoted above).

Regarding claim 8: further comprising selecting an enhanced position among the set of explored positions for a point of interest for which the corresponding reprojection error is the lowest or below a threshold (Briggs, P[0044], quoted above).

Regarding claim 9: further comprising, if the enhanced position is equal to the reference position for the point of interest: updating the displacement policy (Briggs, P[0047]: "The optimization module 440 adds 614 regularization to the parameter vector to generate a regularized parameter vector. The optimization module 440 computes 616 a gradient with the regularized parameter vector. Using the computed gradient, the optimization module 440 updates 618 the parameter vector.", and P[0048]: "The optimization module 440 determines 620 if an objective is achieved, e.g., whether an aggregate displacement error of key points in the rectilinear images is less than a threshold error. Responsive to determining that the objective is achieved, optimization module 440 exits 622 the gradient descent loop … a calibration process is completed when the objective is achieved and a final parameter vector is obtained. Accordingly, the image processing system 100 may perform step 550 shown in FIG. 5 to transform the other set of images captured by the calibrated camera rig system 130."); and repeating the iterated steps of determining a projection model candidate and evaluating its associated reprojection error with a new set of positions to be explored determined according to the updated displacement policy, which describes continuous operation of the program after acceptable convergence between the matching keypoints is met, and which is disclosed by Briggs in P[0015]: "The camera rig system 130 is a multi-camera system designed to capture images or videos (e.g., media) of a local area or a local scene.", where the program works for multiple frames of the video, which describes continuous operation.

Regarding claim 10: wherein updating the displacement policy (Briggs, the parameter vector of P[0047], quoted above) comprises reducing the one or more displacement lengths and narrowing the search area (Briggs, P[0004]: "To solve the optimization problem, the image processing system may use a gradient descent loop to generate parameter vectors. A parameter vector contains information about, e.g., positions of the four corners of each rectilinear image, and is used to jointly rectify circumferentially arranged cameras of the multi-camera system. An optimized parameter vector may be obtained when total displacement error of the matching key points in an equirectangular projection is minimized. The positions of the images may be used to determine calibration parameters.", and further in Figs. 3B-3D, which show the displacement of the corners, which reduces the search area to the overlapping region containing the matching keypoints that are to be brought together using an iterative process).

Regarding claim 11, as best understood: wherein the completion condition is fulfilled if one or more of the following conditions is/are fulfilled:
- all positions allowed by the displacement policy are explored;
- error below a threshold;
- number of iterations reached;
- all positions of a search area explored;
- all sets of positions explored;
is disclosed by Briggs in P[0044]: "the optimization module 440 stops the gradient descent loop and obtains a final parameter vector {right arrow over (θ)} responsive to determining that a particular criteria is satisfied. Example criteria may include reducing the aggregate displacement error under a threshold error, or iterating for a certain number of rounds in the gradient decent loop".

Regarding claim 12: wherein the completion condition is fulfilled if enhanced positions for all points of interest are equal to their corresponding reference positions, where the completion condition is when the displacement between the matching points is below a threshold, and where overlapping points constitute a displacement distance of 0 (zero), which is below any positive threshold; Briggs, P[0044]: "Example criteria may include reducing the aggregate displacement error under a threshold error".

Regarding claim 13: wherein a projection model candidate is determined by solving an optimization problem using the multiple pairs of points of interest is disclosed by Briggs in the Abstract: "The image processing system may generate a parameter vector by solving an optimization problem using a gradient descent loop and the key points. The optimization problem reduces a displacement error to align the key points of the rectilinear images by adjusting calibration of the cameras or a transform of the images (which corresponds to camera calibration)."

Regarding claim 14: wherein the projection model is chosen as the projection model candidate associated with the lowest evaluated reprojection error is disclosed by Briggs in Figs. 3B-3D, which show that the model with mitigated displacement error is chosen, and further in P[0044]: "the optimization module 440 stops the gradient descent loop and obtains a final parameter vector {right arrow over (θ)} responsive to determining that a particular criteria is satisfied. Example criteria may include reducing the aggregate displacement error under a threshold error".

Claim 15 recites features nearly identical to those recited in claim 1. Claim 15 is rejected for reasons analogous to those discussed above in conjunction with claim 1.

Regarding claim 16, as best understood: A non-transitory computer-readable medium carrying a computer program to execute a method is disclosed by Briggs in P[0053]: "may comprise information resulting from a computing process, where the information is stored on a non-transitory, tangible computer readable storage medium and may include any embodiment of a computer program product or other data combination described herein." The rest of the features of claim 16 are recited nearly identically to those recited in claim 1. Claim 16 is rejected for reasons analogous to those discussed above in conjunction with claim 1.

Conclusion

8. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. US 20190043220 A1 by Avinash Kumar et al. is directed toward detecting the same feature across several frames between multiple cameras and using the features for matching to perform photometric calibration.

9. Any inquiry concerning this communication or earlier communications from the examiner should be directed to TY M BEATTY, whose telephone number is (703) 756-5370. The examiner can normally be reached Mon-Fri, 8 AM-4 PM EST.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Gregory Morse, can be reached at (571) 272-3838. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/TY MITCHELL BEATTY/
Examiner, Art Unit 2663

/GREGORY A MORSE/
Supervisory Patent Examiner, Art Unit 2698
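The iterative loop the examiner maps to Briggs (displace a point of interest, refit a projection model candidate from the pairs of points, keep the candidate with the lowest reprojection error) can be sketched in miniature. This is an illustrative reconstruction, not the applicant's or Briggs's actual implementation: a pure-translation model stands in for the claimed "projection model", a unit-step displacement policy stands in for the claimed search areas, and all function names (`fit_translation`, `reprojection_error`, `calibrate`) are hypothetical.

```python
import math

def fit_translation(pairs):
    """Least-squares translation (dx, dy) mapping first-image points onto second-image points."""
    dx = sum(q[0] - p[0] for p, q in pairs) / len(pairs)
    dy = sum(q[1] - p[1] for p, q in pairs) / len(pairs)
    return (dx, dy)

def reprojection_error(model, pairs):
    """Sum of distances between projected first-image points and their matched points."""
    dx, dy = model
    return sum(math.hypot(p[0] + dx - q[0], p[1] + dy - q[1]) for p, q in pairs)

def calibrate(pairs, steps=((1, 0), (-1, 0), (0, 1), (0, -1)), max_rounds=50):
    """Iteratively displace one point of interest at a time, refit a model
    candidate from all pairs, and keep the candidate with the lowest
    reprojection error. Stops when no single-step displacement improves
    the error (a simple completion condition)."""
    pairs = list(pairs)
    best_model = fit_translation(pairs)
    best_err = reprojection_error(best_model, pairs)
    for _ in range(max_rounds):
        improved = False
        for i in range(len(pairs)):
            p, q = pairs[i]
            for sx, sy in steps:
                trial = list(pairs)
                trial[i] = ((p[0] + sx, p[1] + sy), q)  # displaced point of interest
                candidate = fit_translation(trial)      # projection model candidate
                err = reprojection_error(candidate, trial)
                if err < best_err:                      # keep the lowest-error candidate
                    best_err, best_model, pairs = err, candidate, trial
                    improved = True
        if not improved:
            break
    return best_model, best_err

# Three consistent pairs offset by (5, 3), plus one noisy match:
pairs = [((0, 0), (5, 3)), ((10, 0), (15, 3)), ((0, 10), (5, 13)), ((2, 2), (8, 4))]
model, err = calibrate(pairs)
# model -> (5.0, 3.0): the noisy point is displaced onto the consistent offset
```

The sketch shows why the §112(b) questions matter: whether the displaced point belongs to the first or second image, and which steps repeat, both change the loop structure.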

Prosecution Timeline

Nov 21, 2023 — Application Filed
Feb 05, 2026 — Non-Final Rejection — §102, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597275
VEHICLE INTERIOR MONITORING SYSTEM
2y 5m to grant · Granted Apr 07, 2026
Patent 12579653
AUTOMATED METHOD FOR TOOTH SEGMENTATION OF THREE DIMENSIONAL SCAN DATA USING TOOTH BOUNDARY CURVE AND COMPUTER READABLE MEDIUM HAVING PROGRAM FOR PERFORMING THE METHOD
2y 5m to grant · Granted Mar 17, 2026
Patent 12555212
OBJECT DETECTION DEVICE AND METHOD FOR DETECTING MALFUNCTION OF OBJECT DETECTION DEVICE
2y 5m to grant · Granted Feb 17, 2026
Patent 12511787
METHOD, DEVICE AND SYSTEM OF POINT CLOUD COMPRESSION FOR INTELLIGENT COOPERATIVE PERCEPTION SYSTEM
2y 5m to grant · Granted Dec 30, 2025
Patent 12511750
IMAGE PROCESSING METHOD AND APPARATUS BASED ON IMAGE PROCESSING MODEL, ELECTRONIC DEVICE, STORAGE MEDIUM, AND COMPUTER PROGRAM PRODUCT
2y 5m to grant · Granted Dec 30, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 70%
With Interview: 99% (+42.3%)
Median Time to Grant: 3y 1m
PTA Risk: Low
Based on 27 resolved cases by this examiner. Grant probability derived from career allow rate.
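The 99% with-interview figure is consistent with applying the +42.3% relative interview lift to the 70% base rate and capping the result at 99%. The dashboard does not state its exact formula, so this is only one plausible reading, and the function name `with_interview` and the 0.99 cap are assumptions:

```python
def with_interview(base_rate, interview_lift, cap=0.99):
    """Apply a relative interview lift to a base grant probability.
    Capping below 1.0 reflects that no prosecution outcome is certain."""
    return min(base_rate * (1.0 + interview_lift), cap)

# 70% career allow rate with a +42.3% relative lift:
p = with_interview(0.70, 0.423)  # 0.70 * 1.423 ≈ 0.996, capped to 0.99
```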
