Prosecution Insights
Last updated: April 19, 2026
Application No. 17/995,117

METHOD AND SYSTEM FOR CALCULATING VEHICLE TRAILER ANGLE

Non-Final OA: §101, §103
Filed: Sep 30, 2022
Examiner: RUSH, ERIC
Art Unit: 2677
Tech Center: 2600 — Communications
Assignee: Continental Autonomous Mobility Germany GmbH
OA Round: 3 (Non-Final)
Grant Probability: 61% (Moderate)
OA Rounds: 3-4
To Grant: 3y 5m
With Interview: 97%

Examiner Intelligence

Career Allow Rate: 61% (grants 61% of resolved cases; 383 granted / 628 resolved; -1.0% vs TC avg)
Interview Lift: +36.2% (strong lift; resolved cases with interview vs without)
Avg Prosecution: 3y 5m (typical timeline; 32 currently pending)
Total Applications: 660 (career history across all art units)

Statute-Specific Performance

§101: 10.8% (-29.2% vs TC avg)
§103: 40.0% (+0.0% vs TC avg)
§102: 12.7% (-27.3% vs TC avg)
§112: 27.7% (-12.3% vs TC avg)
"vs TC avg" = deviation from the Tech Center average estimate • Based on career data from 628 resolved cases
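The headline percentages above follow from simple ratios. A small sketch reproduces them; the formulas, and the roughly 62% Tech Center average they imply, are inferred from the displayed numbers and are not stated by the report:

```python
# How the dashboard's headline figures can be reproduced (assumed formulas;
# the report does not document exactly how its percentages are computed).

def allow_rate_pct(granted: int, resolved: int) -> float:
    """Career allowance rate as a percent of resolved cases."""
    return 100.0 * granted / resolved

career = allow_rate_pct(383, 628)   # 383 granted / 628 resolved -> ~61%

# "-1.0% vs TC avg" implies a Tech Center average near 62% (inferred, not stated).
tc_avg = 62.0
delta = career - tc_avg

print(f"Career Allow Rate: {career:.0f}% ({delta:+.1f}% vs TC avg)")
# prints: Career Allow Rate: 61% (-1.0% vs TC avg)
```

The same ratio applied per rejection type would yield the statute-specific rates listed above.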

Office Action

§101 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

This action is responsive to the request for continued examination (RCE), amendments and remarks received 15 December 2025. Claims 1, 2 and 6 - 14 are currently pending.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 15 December 2025 has been entered.

Claim Objections

Claim 1 is objected to because of the following informalities: Line 22 of claim 1 recites, in part, “the vehicle” which appears to contain inconsistent claim terminology and/or a minor informality. The Examiner suggests amending the claim to --the towing vehicle-- in order to maintain consistency with line 2 of claim 1 and to improve the clarity and precision of the claim. Appropriate correction is required.

Claim 12 is objected to because of the following informalities: Line 24 of claim 12 recites, in part, “the vehicle” which appears to contain inconsistent claim terminology and/or a minor informality. The Examiner suggests amending line 24 of claim 12 to --the towing vehicle-- in order to maintain consistency with line 2 of claim 12 and to improve the clarity and precision of the claim. Appropriate correction is required.

The objection to claim 9, due to a minor informality, is hereby withdrawn in view of the amendments and remarks received 15 December 2025.

Claim Rejections - 35 USC § 101

The text of those sections of Title 35, U.S.
Code not included in this action can be found in a prior Office action. The rejections to claims 1, 2 and 4 - 13 under 35 U.S.C. 101 are hereby withdrawn in view of the amendments and remarks received 15 December 2025.

Response to Arguments

Applicant's arguments filed 15 December 2025 have been fully considered but they are not persuasive. On pages 9 - 10 of the remarks the Applicant’s Representative argues that Diessner et al. fail to “disclose, teach, or suggest calculating a trailer angle without using a location of the towball of the vehicle.” The Applicant’s Representative argues that Diessner et al. teach “that it is necessary to first identify the position of the hitch ball in the images, because ‘[t]he trailer angle detection system rotates the field of view around the tip of hitch.’” Therefore, the Applicant’s Representative argues that Diessner et al. is “different from claim 1, which requires calculating first and second angle estimations without using a location of the towball of the vehicle.”

The Examiner respectfully disagrees. In response to applicant's argument that the references fail to show certain features of the invention, it is noted that the features upon which applicant relies (i.e., “calculating a trailer angle without using a location of the towball of the vehicle”) are not recited in the rejected claim(s). Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993). The Examiner asserts that the instant claims recite and require that “the calculating of the first and second angle estimations does not use a location of a towball of the vehicle.” Additionally, the Examiner asserts that, at least, Diessner et al.
disclose “wherein the calculating of the first and second angle estimations does not use a location of a towball of the vehicle”, see at least the abstract, figures 1, 2, 4 and 6, page 1 paragraph 0005, page 2 paragraphs 0018 - 0019 and 0023 - 0024, page 3 paragraphs 0036, 0040 and 0045 and page 4 paragraphs 0047 and 0049 - 0051 of Diessner et al. wherein they disclose that the “field of view of the rear camera is centered around the tip of the hitch. This allows the system to measure trailer angle by analyzing the movement of visual features from the video frames of captured image data” [0024], that two “different approaches are used in concert to calculate the trailer angle. A kinematic model… and an analysis of the movement of visual features from the video frame, relative to a reference frame, is used” [0036], that in “order to track the movement of the trailer in video stream, the trailer angle detection system uses several algorithms developed for computer vision to calculate how much angular difference there is between the current video frame and the reference frame” [0040] and that for “features that survive the filtering, the mean angular difference is calculated, resulting in a consensus angle that the frame differs from the reference frame. The absolute measured angle of the trailer is the consensus angle computed above, plus the offset angle for the current reference image” [0047]. The Examiner asserts that, as shown herein above and in the cited portions, Diessner et al. disclose that the trailer angle is determined based on calculating the angular differences between features in a current video frame and matching features in a reference frame.
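The angular-difference scheme described in the passages quoted above (vectors from a fix point to matched features, an outlier filter, then an average yielding a consensus angle) can be illustrated with a minimal sketch. This is an illustration of the general technique only; the median-based outlier test and the 5-degree threshold are assumptions, not code or parameters from any cited reference:

```python
import math

# Sketch of a consensus trailer angle from matched image features: for each
# matched feature, take the vector from a fix point to the feature in the
# reference frame and in the current frame, measure the angular difference,
# drop outliers (non-trailer features), and average the survivors.

def feature_angle(fix, pt):
    """Angle of the vector from the fix point to a feature position."""
    return math.atan2(pt[1] - fix[1], pt[0] - fix[0])

def consensus_angle(fix, matches, max_spread=math.radians(5.0)):
    """Mean angular difference over matched (reference, current) feature pairs.

    matches: list of ((x_ref, y_ref), (x_cur, y_cur)) feature positions.
    Pairs whose angular difference strays from the median difference by more
    than max_spread are treated as non-trailer features and dropped.
    (Angle wrap-around at +/-pi is ignored for simplicity.)
    """
    diffs = sorted(feature_angle(fix, cur) - feature_angle(fix, ref)
                   for ref, cur in matches)
    median = diffs[len(diffs) // 2]
    kept = [d for d in diffs if abs(d - median) <= max_spread]
    return sum(kept) / len(kept)
```

Features that rotate together about the fix point produce that common rotation as the consensus, while a feature with unrelated motion (e.g. a point on the ground) is filtered out, matching the filtering rationale the Examiner quotes.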
The Examiner asserts that the angular differences between matching features in the current and reference video frames calculated by Diessner et al., corresponding to calculated angle estimations, depend only on the movement of the matching features from the current video frame relative to the reference video frame. Furthermore, the Examiner asserts that Diessner et al. disclose that the angular differences between matching features in the current and reference video frames indicate how much angular difference there is between the current and reference video frames. Nowhere in Diessner et al. is it disclosed, suggested, or implied that their calculations of angular differences between matching features in current and reference video frames use or require a location of a towball of the vehicle. Therefore, the Examiner asserts that, at least, Diessner et al. disclose “wherein the calculating of the first and second angle estimations does not use a location of a towball of the vehicle.”

Claim Rejections - 35 USC § 103

The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.

Claims 1, 2, 6 and 8 - 13 are rejected under 35 U.S.C. 103 as being unpatentable over Diessner et al. U.S. Publication No. 2018/0276839 A1 in view of Haja et al. German Publication No. DE 102011113197 A1. The Examiner notes that citations to Haja et al. correspond to the machine translation previously provided.

- With regards to claim 1, Diessner et al. disclose a method for determining a yaw angle of a trailer with respect to a longitudinal axis of a towing vehicle, (Diessner et al., Abstract, Figs. 1 - 6, Pg. 1 ¶ 0005 and 0015 - 0017, Pg. 2 ¶ 0019, Pg. 3 ¶ 0035 - 0040, Pg. 3 ¶ 0045 - Pg. 4 ¶ 0051) the method comprising: - capturing at least a first image and a second image of the trailer using a camera, (Diessner et al., Abstract, Figs. 3 - 5, Pg. 1 ¶ 0005 and 0015, Pg. 2 ¶ 0018, Pg. 3 ¶ 0029 - 0037, Pg.
4 ¶ 0053 - 0055) an orientation of the trailer with respect to the towing vehicle being different on the at least first and second images; (Diessner et al., Abstract, Fig. 8, Pg. 3 ¶ 0029 - 0037 and 0040, Pg. 3 ¶ 0045 - Pg. 4 ¶ 0051) - determining at least a first feature and a second feature of the trailer which are visible on the first and second images, wherein the first and second features are arranged at different positions of the trailer; (Diessner et al., Abstract, Fig. 4, Pg. 1 ¶ 0005, Pg. 2 ¶ 0018 - 0021 and 0024 - 0027, Pg. 3 ¶ 0029 and 0040 - 0042) - calculating a first angle estimation, the first angle estimation characterizing a pivot angle in a horizontal plane between the first feature on the first image and the first feature on the second image with respect to a fix point of the towing vehicle, (Diessner et al., Abstract, Figs. 1, 2 & 4, Pg. 1 ¶ 0005, Pg. 1 ¶ 0015, Pg. 2 ¶ 0018 - 0019 and 0023 - 0024, Pg. 3 ¶ 0029 - 0037 and 0040, Pg. 3 ¶ 0044 - Pg. 4 ¶ 0051 [“The field of view of the rear camera is centered around the tip of the hitch. This allows the system to measure trailer angle by analyzing the movement of visual features from the video frames of captured image data”, “matched features that are on the trailer will have similar vector angles or low angular differences in position between the current and the reference frame, while matched features not on the trailer will have random angular changes or differences or dissimilar angular differences of vectors of features over multiple frames of captured image data. 
For example, features determined on a trailer, as the trailer moves relative to the vehicle (such as during a turning maneuver of the vehicle and trailer) will have similar angular feature vectors (and thus the differences between the vector angles will be low and similar or non-random) in that the features move together relative to the vehicle, while features that are not indicative of features on the trailer, such as features of an object on the ground, will have dissimilar or random angular differences or changes as they move over multiple frames of captured image data, due to the non-uniform movement of the vehicle relative to the object, with the field of view of the camera changing relative to the object” and “For features that survive the filtering, the mean angular difference is calculated, resulting in a consensus angle that the frame differs from the reference frame”]) the fix point being a position of the camera; (Diessner et al., Abstract, Figs. 1, 2 & 4, Pg. 1 ¶ 0015 - Pg. 2 ¶ 0020, Pg. 2 ¶ 0023 - 0024, Pg. 3 ¶ 0035 - 0037, 0040 and 0044 - 0045, Pg. 4 ¶ 0047 [“The field of view of the rear camera is centered around the tip of the hitch. This allows the system to measure trailer angle by analyzing the movement of visual features from the video frames of captured image data. The field of view is shown in FIG. 7, where features are detected only in the white regions, and not the black regions of the mask”, “In order to track the movement of the trailer in video stream, the trailer angle detection system uses several algorithms developed for computer vision to calculate how much angular difference there is between the current video frame and the reference frame” and “For features that survive the filtering, the mean angular difference is calculated, resulting in a consensus angle that the frame differs from the reference frame. 
The absolute measured angle of the trailer is the consensus angle computed above, plus the offset angle for the current reference image”]) - calculating a second angle estimation, the second angle estimation characterizing a pivot angle in a horizontal plane between the second feature on the first image and the second feature on the second image with respect to the fix point of the towing vehicle; (Diessner et al., Abstract, Figs. 1, 2 & 4, Pg. 1 ¶ 0005, Pg. 1 ¶ 0015, Pg. 2 ¶ 0018 - 0019 and 0023 - 0024, Pg. 3 ¶ 0029 - 0037 and 0040, Pg. 3 ¶ 0044 - Pg. 4 ¶ 0051 [“The field of view of the rear camera is centered around the tip of the hitch. This allows the system to measure trailer angle by analyzing the movement of visual features from the video frames of captured image data”, “matched features that are on the trailer will have similar vector angles or low angular differences in position between the current and the reference frame, while matched features not on the trailer will have random angular changes or differences or dissimilar angular differences of vectors of features over multiple frames of captured image data. 
For example, features determined on a trailer, as the trailer moves relative to the vehicle (such as during a turning maneuver of the vehicle and trailer) will have similar angular feature vectors (and thus the differences between the vector angles will be low and similar or non-random) in that the features move together relative to the vehicle, while features that are not indicative of features on the trailer, such as features of an object on the ground, will have dissimilar or random angular differences or changes as they move over multiple frames of captured image data, due to the non-uniform movement of the vehicle relative to the object, with the field of view of the camera changing relative to the object” and “For features that survive the filtering, the mean angular difference is calculated, resulting in a consensus angle that the frame differs from the reference frame”]) and - calculating the yaw angle based on the first and second angle estimations, (Diessner et al., Abstract, Figs. 2 - 6, Pg. 1 ¶ 0005 and 0015 - 0017, Pg. 3 ¶ 0040 - Pg. 4 ¶ 0051) wherein the calculating of the first and second angle estimations comprises determining vectors between the fix point and the first and second features in the first and second images, (Diessner et al., Abstract, Fig. 4, Pg. 1 ¶ 0005, Pg. 2 ¶ 0019 - 0021, Pg. 3 ¶ 0029 - 0037 and 0040, Pg. 3 ¶ 0044 - Pg. 4 ¶ 0051) and wherein the calculating of the first and second angle estimations does not use a location of a towball of the vehicle. (Diessner et al., Abstract, Figs. 1, 2, 4 & 6, Pg. 1 ¶ 0005, Pg. 2 ¶ 0018 - 0019 and 0023 - 0024, Pg. 3 ¶ 0036, 0040 and 0045, Pg. 4 ¶ 0047 and 0049 - 0051) Diessner et al. fail to disclose explicitly determining optical rays between the fix point and the features, and the determining comprising using camera calibration information of the camera to transform positions of the first and second features into the optical rays. Pertaining to analogous art, Haja et al. 
disclose wherein the calculating of the first and second angle estimations comprises determining optical rays between the fix point and the first and second features in the first and second images, (Haja et al., Fig. 3, Pg. 1 ¶ 0001 and 0006 - 0009, Pg. 1 ¶ 0013 - Pg. 2 ¶ 0015, Pg. 2 ¶ 0024 - 0027, Pg. 3 ¶ 0031 - 0033 and 0035 - 0037) the determining comprising using camera calibration information of the camera to transform positions of the first and second features into the optical rays. (Haja et al., Fig. 3, Pg. 1 ¶ 0006 - 0009, Pg. 2 ¶ 0015 and 0024 - 0027, Pg. 3 ¶ 0029 - 0031 and 0035 - 0037) Diessner et al. and Haja et al. are combinable because they are both directed towards image processing systems and methods for determining an angle between a tow vehicle and a trailer. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Diessner et al. with the teachings of Haja et al. This modification would have been prompted in order to enhance the base device of Diessner et al. with the well-known and applicable technique Haja et al. applied to a comparable device. Determining optical rays between the fix point and the features, as taught by Haja et al., would enhance the base device of Diessner et al. by enabling positions of the fix point and the features in images to be determined in three-dimensional real-world space so as to allow for their positions, and thus angles, to be more precisely determined and thereby improving the ability of the base device of Diessner et al. to accurately and reliably determine trailer angles between tow vehicles and trailers. Furthermore, this modification would have been prompted by the teachings and suggestions of Diessner et al. 
to calculate the trailer position in physical space and that camera and system parameters can be utilized to estimate real-world measurements, see at least page 2 paragraphs 0020 - 0022 and page 3 paragraphs 0031 - 0034 of Diessner et al. This combination could be completed according to well-known techniques in the art and would likely yield predictable results, in that optical rays between the fix point and the features would be determined in order to enable positions of the fix point and the features in images to be more precisely determined in three-dimensional real-world space so as to improve the ability of the base device of Diessner et al. to accurately and reliably determine trailer angles between tow vehicles and trailers. Therefore, it would have been obvious to combine Diessner et al. with Haja et al. to obtain the invention as specified in claim 1.

- With regards to claim 2, Diessner et al. in view of Haja et al. disclose the method according to claim 1, wherein on the first or second image, the yaw angle of the trailer with respect to the towing vehicle is zero or any known yaw angle which is usable as reference angle. (Diessner et al., Fig. 8, Pg. 3 ¶ 0029 - 0036, Pg. 4 ¶ 0047 - 0051)

- With regards to claim 6, Diessner et al. in view of Haja et al. disclose the method according to claim 1, wherein in addition to the first and second features, at least one further feature of the trailer is used for calculating the yaw angle. (Diessner et al., Abstract, Figs. 4 & 6, Pg. 1 ¶ 0005, Pg. 2 ¶ 0018 - 0021 and 0023 - 0024, Pg. 2 ¶ 0026 - Pg. 3 ¶ 0029, Pg. 3 ¶ 0036, Pg. 3 ¶ 0040 - Pg. 4 ¶ 0047, Pg. 4 ¶ 0051 [“The algorithm takes the set of features separately detected in the current frame and the reference frame, and finds the correspondence.
This produces a set of feature references that are deemed to have matched” and “For features that survive the filtering, the mean angular difference is calculated, resulting in a consensus angle that the frame differs from the reference frame.” The Examiner asserts that one of ordinary skill in the art reading Diessner et al. would understand that Diessner et al. disclose that three or more features of the trailer may be used when calculating the trailer angle.])

- With regards to claim 8, Diessner et al. in view of Haja et al. disclose the method according to claim 1, wherein the yaw angle is calculated by establishing an average value of the first and second angle estimations or by using a statistical approach applied to the first and second angle estimations. (Diessner et al., Pg. 3 ¶ 0040 - Pg. 4 ¶ 0047)

- With regards to claim 9, Diessner et al. in view of Haja et al. disclose the method according to claim 1, further comprising determining an angle window, the angle window comprising an upper bound and a lower bound around a yaw angle, (Diessner et al., Figs. 7 & 8, Pg. 2 ¶ 0023 - 0024, Pg. 3 ¶ 0040 - Pg. 4 ¶ 0051) determining a set of features which lead to angle estimations within the angle window, and (Diessner et al., Figs. 7 & 8, Pg. 2 ¶ 0023 - 0024, Pg. 3 ¶ 0040 - Pg. 4 ¶ 0051) using the determined set of features for future yaw angle calculations. (Diessner et al., Figs. 7 & 8, Pg. 2 ¶ 0023 - 0027, Pg. 3 ¶ 0040 - Pg. 4 ¶ 0051)

- With regards to claim 10, Diessner et al. in view of Haja et al. disclose the method according to claim 1, wherein a value of a calculated yaw angle is increased by a certain portion or percentage in order to remedy underestimations. (Diessner et al., Pg. 3 ¶ 0040 - Pg. 4 ¶ 0051 [“For features that survive the filtering, the mean angular difference is calculated, resulting in a consensus angle that the frame differs from the reference frame.
The absolute measured angle of the trailer is the consensus angle computed above, plus the offset angle for the current reference image.” The Examiner asserts that “in order to remedy underestimations” is an intended use/intended result limitation and that intended use/intended result limitations are not given patentable weight, see at least MPEP § 2111.02 and § 2111.04.])

- With regards to claim 11, Diessner et al. in view of Haja et al. disclose the method according to claim 1, wherein the camera is a rear view camera of the towing vehicle. (Diessner et al., Abstract, Figs. 1 & 3 - 5, Pg. 1 ¶ 0015 - 0017, Pg. 2 ¶ 0022 - 0024)

- With regards to claim 12, Diessner et al. disclose a system for determining a yaw angle of a trailer with respect to a longitudinal axis of a towing vehicle, (Diessner et al., Abstract, Figs. 1 - 6, Pg. 1 ¶ 0005 and 0015 - 0017, Pg. 2 ¶ 0019, Pg. 3 ¶ 0035 - 0040, Pg. 3 ¶ 0045 - Pg. 4 ¶ 0051, Pg. 4 ¶ 0053 - 0054) the system comprising a camera for capturing images of the trailer (Diessner et al., Abstract, Figs. 3 - 5, Pg. 1 ¶ 0005 and 0015, Pg. 2 ¶ 0018, Pg. 3 ¶ 0029 - 0037, Pg. 4 ¶ 0053 - 0055) and a processing entity, (Diessner et al., Abstract, Figs. 3 - 5, Pg. 1 ¶ 0015 - 0017, Pg. 2 ¶ 0023 - 0024, Pg. 4 ¶ 0053 - 0056, Pg. 5 Claim 1, Pg. 6 Claims 14 and 18) the system further being configured to execute a method comprising: - capturing at least a first image and a second image of the trailer using the camera, (Diessner et al., Abstract, Figs. 3 - 5, Pg. 1 ¶ 0005 and 0015, Pg. 2 ¶ 0018, Pg. 3 ¶ 0029 - 0037, Pg. 4 ¶ 0053 - 0055) an orientation of the trailer with respect to the towing vehicle being different on the at least first and second images; (Diessner et al., Abstract, Fig. 8, Pg. 3 ¶ 0029 - 0037 and 0040, Pg. 3 ¶ 0045 - Pg.
4 ¶ 0051) - determining at least a first feature and a second feature of the trailer which are visible on the first and second images, wherein the first and second features are arranged at different positions of the trailer; (Diessner et al., Abstract, Fig. 4, Pg. 1 ¶ 0005, Pg. 2 ¶ 0018 - 0021 and 0024 - 0027, Pg. 3 ¶ 0029 and 0040 - 0042) - calculating a first angle estimation, the first angle estimation characterizing a pivot angle in a horizontal plane between the first feature on the first image and the first feature on the second image with respect to a fix point of the towing vehicle, (Diessner et al., Abstract, Figs. 1, 2 & 4, Pg. 1 ¶ 0005, Pg. 1 ¶ 0015, Pg. 2 ¶ 0018 - 0019 and 0023 - 0024, Pg. 3 ¶ 0029 - 0037 and 0040, Pg. 3 ¶ 0044 - Pg. 4 ¶ 0051 [“The field of view of the rear camera is centered around the tip of the hitch. This allows the system to measure trailer angle by analyzing the movement of visual features from the video frames of captured image data”, “matched features that are on the trailer will have similar vector angles or low angular differences in position between the current and the reference frame, while matched features not on the trailer will have random angular changes or differences or dissimilar angular differences of vectors of features over multiple frames of captured image data. 
For example, features determined on a trailer, as the trailer moves relative to the vehicle (such as during a turning maneuver of the vehicle and trailer) will have similar angular feature vectors (and thus the differences between the vector angles will be low and similar or non-random) in that the features move together relative to the vehicle, while features that are not indicative of features on the trailer, such as features of an object on the ground, will have dissimilar or random angular differences or changes as they move over multiple frames of captured image data, due to the non-uniform movement of the vehicle relative to the object, with the field of view of the camera changing relative to the object” and “For features that survive the filtering, the mean angular difference is calculated, resulting in a consensus angle that the frame differs from the reference frame”]) the fix point being a position of the camera; (Diessner et al., Abstract, Figs. 1, 2 & 4, Pg. 1 ¶ 0015 - Pg. 2 ¶ 0020, Pg. 2 ¶ 0023 - 0024, Pg. 3 ¶ 0035 - 0037, 0040 and 0044 - 0045, Pg. 4 ¶ 0047 [“The field of view of the rear camera is centered around the tip of the hitch. This allows the system to measure trailer angle by analyzing the movement of visual features from the video frames of captured image data. The field of view is shown in FIG. 7, where features are detected only in the white regions, and not the black regions of the mask”, “In order to track the movement of the trailer in video stream, the trailer angle detection system uses several algorithms developed for computer vision to calculate how much angular difference there is between the current video frame and the reference frame” and “For features that survive the filtering, the mean angular difference is calculated, resulting in a consensus angle that the frame differs from the reference frame. 
The absolute measured angle of the trailer is the consensus angle computed above, plus the offset angle for the current reference image”]) - calculating a second angle estimation, the second angle estimation characterizing a pivot angle in a horizontal plane between the second feature on the first image and the second feature on the second image with respect to the fix point of the towing vehicle; (Diessner et al., Abstract, Figs. 1, 2 & 4, Pg. 1 ¶ 0005, Pg. 1 ¶ 0015, Pg. 2 ¶ 0018 - 0019 and 0023 - 0024, Pg. 3 ¶ 0029 - 0037 and 0040, Pg. 3 ¶ 0044 - Pg. 4 ¶ 0051 [“The field of view of the rear camera is centered around the tip of the hitch. This allows the system to measure trailer angle by analyzing the movement of visual features from the video frames of captured image data”, “matched features that are on the trailer will have similar vector angles or low angular differences in position between the current and the reference frame, while matched features not on the trailer will have random angular changes or differences or dissimilar angular differences of vectors of features over multiple frames of captured image data. 
For example, features determined on a trailer, as the trailer moves relative to the vehicle (such as during a turning maneuver of the vehicle and trailer) will have similar angular feature vectors (and thus the differences between the vector angles will be low and similar or non-random) in that the features move together relative to the vehicle, while features that are not indicative of features on the trailer, such as features of an object on the ground, will have dissimilar or random angular differences or changes as they move over multiple frames of captured image data, due to the non-uniform movement of the vehicle relative to the object, with the field of view of the camera changing relative to the object” and “For features that survive the filtering, the mean angular difference is calculated, resulting in a consensus angle that the frame differs from the reference frame”]) and - calculating the yaw angle based on the first and second angle estimations, (Diessner et al., Abstract, Figs. 2 - 6, Pg. 1 ¶ 0005 and 0015 - 0017, Pg. 3 ¶ 0040 - Pg. 4 ¶ 0051) wherein the calculating of the first and second angle estimations comprises determining vectors between the fix point and the first and second features in the first and second images, (Diessner et al., Abstract, Fig. 4, Pg. 1 ¶ 0005, Pg. 2 ¶ 0019 - 0021, Pg. 3 ¶ 0029 - 0037 and 0040, Pg. 3 ¶ 0044 - Pg. 4 ¶ 0051) and wherein the calculating of the first and second angle estimations does not use a location of a towball of the vehicle. (Diessner et al., Abstract, Figs. 1, 2, 4 & 6, Pg. 1 ¶ 0005, Pg. 2 ¶ 0018 - 0019 and 0023 - 0024, Pg. 3 ¶ 0036, 0040 and 0045, Pg. 4 ¶ 0047 and 0049 - 0051) Diessner et al. fail to disclose explicitly determining optical rays between the fix point and the features, and the determining comprising using camera calibration information of the camera to transform positions of the first and second features into the optical rays. Pertaining to analogous art, Haja et al.
disclose wherein the calculating of the first and second angle estimations comprises determining optical rays between the fix point and the first and second features in the first and second images, (Haja et al., Fig. 3, Pg. 1 ¶ 0001 and 0006 - 0009, Pg. 1 ¶ 0013 - Pg. 2 ¶ 0015, Pg. 2 ¶ 0024 - 0027, Pg. 3 ¶ 0031 - 0033 and 0035 - 0037) the determining comprising using camera calibration information of the camera to transform positions of the first and second features into the optical rays. (Haja et al., Fig. 3, Pg. 1 ¶ 0006 - 0009, Pg. 2 ¶ 0015 and 0024 - 0027, Pg. 3 ¶ 0029 - 0031 and 0035 - 0037) Diessner et al. and Haja et al. are combinable because they are both directed towards image processing systems and methods for determining an angle between a tow vehicle and a trailer. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Diessner et al. with the teachings of Haja et al. This modification would have been prompted in order to enhance the base device of Diessner et al. with the well-known and applicable technique Haja et al. applied to a comparable device. Determining optical rays between the fix point and the features, as taught by Haja et al., would enhance the base device of Diessner et al. by enabling positions of the fix point and the features in images to be determined in three-dimensional real-world space so as to allow for their positions, and thus angles, to be more precisely determined and thereby improving the ability of the base device of Diessner et al. to accurately and reliably determine trailer angles between tow vehicles and trailers. Furthermore, this modification would have been prompted by the teachings and suggestions of Diessner et al. 
to calculate the trailer position in physical space and that camera and system parameters can be utilized to estimate real-world measurements, see at least page 2 paragraphs 0020 - 0022 and page 3 paragraphs 0031 - 0034 of Diessner et al. This combination could be completed according to well-known techniques in the art and would likely yield predictable results, in that optical rays between the fix point and the features would be determined in order to enable positions of the fix point and the features in images to be more precisely determined in three-dimensional real-world space so as to improve the ability of the base device of Diessner et al. to accurately and reliably determine trailer angles between tow vehicles and trailers. Therefore, it would have been obvious to combine Diessner et al. with Haja et al. to obtain the invention as specified in claim 12.

- With regards to claim 13, Diessner et al. in view of Haja et al. disclose a system according to claim 12, and a vehicle comprising a system (Diessner et al., Abstract, Figs. 1 - 5, Pg. 1 ¶ 0005 and 0015 - 0017, Pg. 2 ¶ 0021 - 0025, Pg. 4 ¶ 0051 - 0056) according to claim 12. ([Diessner et al. in view of Haja et al. disclose a system according to claim 12, see the analysis of claim 12 provided herein above.])

Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Diessner et al. U.S. Publication No. 2018/0276839 A1 in view of Haja et al. German Publication No. DE 102011113197 A1 as applied to claim 1 above, and further in view of Singh U.S. Publication No. 2021/0064046 A1. The Examiner notes that citations to Haja et al. correspond to the machine translation previously provided.

- With regards to claim 7, Diessner et al. in view of Haja et al. disclose the method according to claim 1, wherein the yaw angle is calculated by establishing a mean value based on the first and second angle estimations. (Diessner et al., Pg. 3 ¶ 0040 - Pg. 4 ¶ 0047) Diessner et al.
fail to disclose explicitly establishing a median value. Pertaining to analogous art, Singh discloses wherein the yaw angle is calculated by establishing a median value based on the first and second angle estimations. (Singh, Figs. 3 & 4, Pg. 2 ¶ 0027 - 0028, Pg. 4 ¶ 0051 - 0053, Pg. 5 ¶ 0058) Diessner et al. in view of Haja et al. and Singh are combinable because they are all directed towards image processing systems and methods that determine an angle of a trailer with respect to a vehicle. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combined teachings of Diessner et al. in view of Haja et al. with the teachings of Singh. This modification would have been prompted in order to substitute the mean value of Diessner et al. for the median value of Singh. The median value of Singh could be substituted in place of the mean value of Diessner et al. utilizing well-known techniques in the art and would likely yield predictable results, in that in the combination a median value of the trailer angles that survive the filtering of Diessner et al. would be utilized as the consensus trailer angle. This combination could be completed according to well-known techniques in the art and would likely yield predictable results, in that a median value of the trailer angles that survive the filtering of the combined base device would be utilized as the consensus angle of the trailer angle detected. Therefore, it would have been obvious to combine Diessner et al. in view of Haja et al. with Singh to obtain the invention as specified in claim 7.

Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Diessner et al. U.S. Publication No. 2018/0276839 A1 in view of Haja et al. German Publication No. DE 102011113197 A1 as applied to claim 1 above, and further in view of Turner U.S. Publication No. 2022/0222850 A1. The Examiner notes that citations to Haja et al. 
correspond to the machine translation previously provided.

- With regards to claim 14, Diessner et al. in view of Haja et al. disclose the method according to claim 1, wherein determining the first and second features in the first and second images comprises a feature matching process, (Diessner et al., Abstract, Fig. 4, Pg. 1 ¶ 0005, Pg. 2 ¶ 0018 - 0019, Pg. 2 ¶ 0023 - Pg. 3 ¶ 0028, Pg. 3 ¶ 0039 - 0042) the feature matching process comprising one or more algorithms. (Diessner et al., Abstract, Fig. 4, Pg. 1 ¶ 0005, Pg. 2 ¶ 0018 - 0019, Pg. 2 ¶ 0023 - Pg. 3 ¶ 0028, Pg. 3 ¶ 0039 - 0042) Diessner et al. fail to disclose explicitly one or more algorithms selected from the group consisting of Harris Corner Detector algorithm, Scale-Invariant Feature Transform algorithm, Speeded Up Robust Features algorithm, Binary Robust Invariant Scalable Keypoints algorithm, Binary Robust Independent Elementary Features (BRIEF) algorithm, and Oriented Features from Accelerated Segment Test (FAST) and rotated BRIEF algorithm. Pertaining to analogous art, Turner discloses wherein determining the first and second features in the first and second images comprises a feature matching process, the feature matching process comprising one or more algorithms selected from the group consisting of Harris Corner Detector algorithm, Scale-Invariant Feature Transform algorithm, Speeded Up Robust Features algorithm, Binary Robust Invariant Scalable Keypoints algorithm, Binary Robust Independent Elementary Features (BRIEF) algorithm, and Oriented Features from Accelerated Segment Test (FAST) and rotated BRIEF algorithm. (Turner, Pg. 2 ¶ 0014, Pg. 3 ¶ 0040, Pg. 5 ¶ 0051) Diessner et al. in view of Haja et al. and Turner are combinable because they are all directed towards image processing systems and methods that determine an angle of a trailer with respect to a vehicle. 
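For context on the technique claim 14 recites: BRIEF-style feature matching describes each keypoint with a binary string of pixel-intensity comparisons and matches descriptors by Hamming distance. The sketch below is a minimal, self-contained illustration of that idea in NumPy; it is not the applicant's or any cited reference's implementation (in practice a library routine such as OpenCV's ORB detector/matcher would be used), and the image and keypoints are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed sampling pattern: 256 pixel-pair offsets inside a 31x31 patch,
# shared by all keypoints (as in BRIEF).
PAIRS = rng.integers(-15, 16, size=(256, 2, 2))

def brief_descriptor(image, kp):
    """256-bit binary descriptor: bit i = 1 if the intensity at offset
    a_i is less than the intensity at offset b_i around keypoint kp."""
    y, x = kp
    a = image[y + PAIRS[:, 0, 0], x + PAIRS[:, 0, 1]]
    b = image[y + PAIRS[:, 1, 0], x + PAIRS[:, 1, 1]]
    return (a < b).astype(np.uint8)

def match(desc1, desc2):
    """Brute-force Hamming matching: for each descriptor in desc1,
    return the index of the nearest descriptor in desc2."""
    # Hamming distance = number of differing bits.
    d = (desc1[:, None, :] != desc2[None, :, :]).sum(axis=2)
    return d.argmin(axis=1)

# Toy check: descriptors of the same patches in the same image
# should match index-for-index.
img = rng.integers(0, 256, size=(100, 100)).astype(np.int32)
kps = [(40, 40), (50, 60), (70, 30)]
d1 = np.stack([brief_descriptor(img, kp) for kp in kps])
d2 = np.stack([brief_descriptor(img, kp) for kp in kps])
print(match(d1, d2))  # identical patches match index-for-index: [0 1 2]
```

Between consecutive trailer images, such matches are what lets the first and second features be re-identified so their angle estimations can be computed.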
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combined teachings of Diessner et al. in view of Haja et al. with the teachings of Turner. This modification would have been prompted in order to substitute the undisclosed feature matching algorithm(s) of Diessner et al. for the BRIEF algorithm of Turner. The BRIEF algorithm of Turner could be substituted in place of the undisclosed feature matching algorithm(s) of Diessner et al. utilizing well-known techniques in the art and would likely yield predictable results, in that in the combination the BRIEF algorithm would be utilized as the feature matching algorithm that determines the first and second features in the first and second images. This combination could be completed according to well-known techniques in the art and would likely yield predictable results, in that the BRIEF algorithm would be utilized as the feature matching algorithm of the combined base device that determines the first and second features in the first and second images. Therefore, it would have been obvious to combine Diessner et al. in view of Haja et al. with Turner to obtain the invention as specified in claim 14.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ERIC RUSH whose telephone number is (571) 270-3017. The examiner can normally be reached 9am - 5pm Monday - Friday. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Andrew Bee, can be reached at (571) 270-5183. 
The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ERIC RUSH/
Primary Examiner, Art Unit 2677
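Taken together, the claim 12 and claim 7 analyses describe standard pinhole-camera geometry: image positions are back-projected through the camera's calibration (intrinsic) matrix into 3-D viewing rays, per-feature angle estimates are derived from those rays relative to the fix point, and a median yields a robust consensus yaw angle. A minimal sketch under assumed intrinsics; the calibration matrix, fix point, and pixel coordinates below are hypothetical illustration values, not taken from the application or the cited references.

```python
import numpy as np

# Assumed pinhole intrinsics (focal length in px, principal point) --
# hypothetical calibration values for illustration only.
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])
K_inv = np.linalg.inv(K)

def pixel_to_ray(u, v):
    """Back-project a pixel into a unit viewing ray in the camera
    frame using the calibration matrix: ray ~ K^-1 @ [u, v, 1]."""
    ray = K_inv @ np.array([u, v, 1.0])
    return ray / np.linalg.norm(ray)

def ray_yaw(ray):
    """Horizontal (yaw) angle of a ray about the camera's vertical axis."""
    return np.degrees(np.arctan2(ray[0], ray[2]))

# Ray to the trailer fix point and per-feature yaw estimates
# (hypothetical pixel positions; the third feature is a mismatch).
fix = pixel_to_ray(640.0, 400.0)
features = [(900.0, 380.0), (910.0, 390.0), (1400.0, 350.0)]
estimates = [ray_yaw(pixel_to_ray(u, v)) - ray_yaw(fix) for u, v in features]

# Median consensus is robust to the outlier from the mismatched feature,
# whereas a mean would be pulled toward it.
yaw = float(np.median(estimates))
print(round(yaw, 2))  # ≈ 18.65 degrees; the ~43 degree outlier is screened out
```

The mean-versus-median distinction argued over claim 7 shows up directly in the last step: `np.mean(estimates)` would land near 26 degrees here, while the median stays with the two consistent features.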

Prosecution Timeline

Sep 30, 2022
Application Filed
Mar 04, 2025
Non-Final Rejection — §101, §103
Jul 07, 2025
Response Filed
Oct 03, 2025
Final Rejection — §101, §103
Nov 04, 2025
Response after Non-Final Action
Dec 15, 2025
Request for Continued Examination
Dec 18, 2025
Response after Non-Final Action
Jan 10, 2026
Non-Final Rejection — §101, §103
Jan 20, 2026
Interview Requested
Jan 28, 2026
Examiner Interview Summary
Jan 28, 2026
Applicant Interview (Telephonic)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12586229
COMPUTER IMPLEMENTED METHODS AND DEVICES FOR DETERMINING DIMENSIONS AND DISTANCES OF HEAD FEATURES
2y 5m to grant Granted Mar 24, 2026
Patent 12548292
METHOD AND SYSTEM FOR IDENTIFYING REFLECTIONS IN THERMAL IMAGES
2y 5m to grant Granted Feb 10, 2026
Patent 12548395
SYSTEMS, METHODS AND DEVICES FOR MONITORING BETTING ACTIVITIES
2y 5m to grant Granted Feb 10, 2026
Patent 12541856
MASKING OF OBJECTS IN AN IMAGE STREAM
2y 5m to grant Granted Feb 03, 2026
Patent 12518504
METHOD FOR CALIBRATING AN OBJECT RE-IDENTIFICATION SOLUTION IMPLEMENTING AN ARRAY OF A PLURALITY OF CAMERAS
2y 5m to grant Granted Jan 06, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
61%
Grant Probability
97%
With Interview (+36.2%)
3y 5m
Median Time to Grant
High
PTA Risk
Based on 628 resolved cases by this examiner. Grant probability derived from career allow rate.
