Prosecution Insights
Last updated: April 19, 2026
Application No. 18/849,786

METHOD FOR FILTERING MEASUREMENT DATA FOR A PATH-FOLLOWING CONTROL OF AN OBJECT

Non-Final OA: §101, §102, §103, §112
Filed: Sep 23, 2024
Examiner: CAMPBELL, JOSHUA A
Art Unit: 3747
Tech Center: 3700 (Mechanical Engineering & Manufacturing)
Assignee: Robert Bosch GmbH
OA Round: 1 (Non-Final)

Grant Probability: 54% (Moderate)
Expected OA Rounds: 1-2
Expected Time to Grant: 3y 7m
Grant Probability With Interview: 76%

Examiner Intelligence

Career Allow Rate: 54% (249 granted / 457 resolved; -15.5% vs TC avg)
Interview Lift: +22.0% (strong) allow-rate lift in resolved cases with an interview vs without
Typical Timeline: 3y 7m avg prosecution; 37 applications currently pending
Career History: 494 total applications across all art units

Statute-Specific Performance

§101: 2.7% (-37.3% vs TC avg)
§102: 21.8% (-18.2% vs TC avg)
§103: 54.8% (+14.8% vs TC avg)
§112: 19.6% (-20.4% vs TC avg)

Comparisons are against a Tech Center average estimate, based on career data from 457 resolved cases.
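As a quick sanity check, the headline examiner figures above are arithmetically consistent with one another. A short sketch, assuming the "vs TC avg" and interview-lift deltas are percentage points (the dashboard does not state this explicitly):

```python
# Consistency check of the examiner statistics shown above.
# All input values are taken from the dashboard; nothing here is new data.

granted, resolved = 249, 457

# Career allow rate: granted / resolved cases
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # ~54.5%, displayed as 54%

# "-15.5% vs TC avg" read as percentage points implies a TC average of ~70%.
tc_avg = allow_rate + 0.155
print(f"Implied TC average: {tc_avg:.1%}")

# "+22.0% Interview Lift" on top of the base rate lands on the 76% figure
# shown under "With Interview".
with_interview = allow_rate + 0.22
print(f"Allow rate with interview: {with_interview:.1%}")  # ~76%
```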

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 11-19 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more. In accordance with MPEP 2106.04, each of claims 11-19 has been analyzed to determine whether it is directed to any judicial exceptions.

Step 2A, Prong 1 per MPEP 2106.04(a)

Each of claims 11-19 recites at least one step or instruction for filtering coordinates of a point cloud, which is grouped as a mental process in MPEP 2106.04(a)(2)(III), a certain method of organizing human activity in MPEP 2106.04(a)(2)(II), or a mathematical concept in MPEP 2106.04(a)(2)(I). Filtering coordinates is both a mathematical concept, as evidenced by the mathematical formula for performing the filtering set forth in claim 13, and a mental process, because it is an evaluation that can be performed in the human mind or with the aid of pen and paper. Accordingly, each of claims 11-19 recites an abstract idea.
Specifically, claims 11, 18 or 19 recite one or more of the following: an object; a sensor including a camera; a control unit; acquiring measurement data (observation, judgment or evaluation, which is grouped as a mental process in MPEP 2106.04(a)(2)(III)); weighting point coordinates (observation, judgment or evaluation, which is grouped as a mental process in MPEP 2106.04(a)(2)(III)); filtering point coordinates (observation, judgment or evaluation, which is grouped as a mental process in MPEP 2106.04(a)(2)(III)). Further, dependent claims 12-17 merely include limitations that either further define the abstract idea (and thus do not make the abstract idea any less abstract) or amount to no more than generally linking the use of the abstract idea to a particular technological environment or field of use, because they are merely incidental or token additions to the claims that do not alter or affect how the claimed functions/steps are performed. Accordingly, as indicated above, each of the above-identified claims recites an abstract idea as in MPEP 2106.04(a).

Step 2A, Prong 2 per MPEP 2106.04(d)

The above-identified abstract idea in each of independent claims 11, 18 and 19 (and their respective dependent claims 12-17) is not integrated into a practical application under MPEP 2106.04(d) because the additional elements (identified above in independent claims 11, 18 and 19), either alone or in combination, generally link the use of the above-identified abstract idea to a particular technological environment or field of use according to MPEP 2106.05(h). More specifically, the additional element of a control unit is a generically recited computer element in independent claims 18 and 19 (and their respective dependent claims) which does not improve the functioning of a computer, or any other technology or technical field, according to MPEP 2106.04(d)(1) and 2106.05(a).
Nor do these above-identified additional elements serve to apply the above-identified abstract idea with, or by use of, a particular machine according to MPEP 2106.05(b), effect a transformation according to MPEP 2106.05(c), provide a particular treatment or prophylaxis according to MPEP 2106.04(d)(2) or apply or use the above-identified abstract idea in some other meaningful way beyond generally linking the use thereof to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception according to MPEP 2106.04(d)(2) and 2106.05(e). Furthermore, the above-identified additional elements do not add a meaningful limitation to the abstract idea because they amount to simply implementing the abstract idea on a computer in accordance with MPEP 2106.05(f). For at least these reasons, the abstract idea identified above in independent claims 11, 18 and 19 (and their respective dependent claims) is not integrated into a practical application in accordance with MPEP 2106.04(d). Moreover, the above-identified abstract idea is not integrated into a practical application in accordance with MPEP 2106.04(d) because the claimed method and system merely implements the above-identified abstract idea (e.g., mental process and certain method of organizing human activity) using rules (e.g., computer instructions) executed by a computer (e.g., control unit, as claimed). In other words, these claims are merely directed to an abstract idea with additional generic computer elements which do not add a meaningful limitation to the abstract idea because they amount to simply implementing the abstract idea on a computer according to MPEP 2106.05(f). 
Additionally, Applicant’s specification does not include any discussion of how the claimed invention provides a technical improvement realized by these claims over the prior art, or any explanation of a technical problem having an unconventional technical solution that is expressed in these claims, according to MPEP 2106.05(a). That is, as in Affinity Labs of Tex. v. DirecTV, LLC, the specification fails to provide sufficient details regarding the manner in which the claimed invention accomplishes any technical improvement or solution. Thus, for these additional reasons, the abstract idea identified above in independent claims 11, 18 and 19 (and their respective dependent claims) is not integrated into a practical application under MPEP 2106.04(d)(I). Accordingly, independent claims 11, 18 and 19 (and their respective dependent claims) are each directed to an abstract idea according to MPEP 2106.04(d).

Step 2B per MPEP 2106.05

None of claims 11-19 includes additional elements that are sufficient to amount to significantly more than the abstract idea in accordance with MPEP 2106.05, for at least the following reasons. These claims require the additional element of a control unit. The above-identified additional elements are generically claimed computer components which enable the above-identified abstract idea(s) to be conducted by performing the basic functions of automating mental tasks. The courts have recognized such computer functions as well understood, routine, and conventional functions when claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity. See MPEP 2106.05(d)(II) along with Versata Dev. Group, Inc. v. SAP Am., Inc., 793 F.3d 1306, 1334, 115 USPQ2d 1681, 1701 (Fed. Cir. 2015); and OIP Techs., 788 F.3d at 1363, 115 USPQ2d at 1092-93.
Per Applicant’s specification, the control unit 130 is generically described without structure and is depicted only schematically in Figure 1 of the drawings [see specification page 10, line 28-page 11, line 8]. Accordingly, in light of Applicant’s specification, the claimed term “control unit” is reasonably construed as a generic computing device. As in SAP America, Inc. v. InvestPic, LLC (Fed. Cir. 2018), it is clear, from the claims themselves and the specification, that these limitations require no improved computer resources, just already available computers, with their already available basic functions, to use as tools in executing the claimed process. See MPEP 2106.05(f). Furthermore, Applicant’s specification does not describe any special programming or algorithms required for the control unit. This lack of disclosure is acceptable under 35 U.S.C. §112(a) since this hardware performs non-specialized functions known by those of ordinary skill in the computer arts. By omitting any specialized programming or algorithms, Applicant's specification essentially admits that this hardware is conventional and performs well understood, routine and conventional activities in the computer industry or arts. In other words, Applicant’s specification demonstrates the well-understood, routine, conventional nature of the above-identified additional elements because it describes these additional elements in a manner that indicates that the additional elements are sufficiently well-known that the specification does not need to describe the particulars of such additional elements to satisfy 35 U.S.C. § 112(a) (see MPEP 2106.05(d)(I)(2) and 2106.07(a)(III)). Adding hardware that performs “‘well understood, routine, conventional activit[ies]’ previously known to the industry” will not make claims patent-eligible (TLI Communications along with MPEP 2106.05(d)(I)).
The recitation of the above-identified additional limitations in claims 18 and 19 amounts to mere instructions to implement the abstract idea on a computer. Simply using a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data) or simply adding a general-purpose computer or computer components after the fact to an abstract idea (e.g., a fundamental economic practice or mathematical equation) does not provide significantly more. See MPEP 2106.05(f) along with Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); and TLI Communications LLC v. AV Automotive, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit). Moreover, implementing an abstract idea on a generic computer does not add significantly more, similar to how the recitation of the computer in the claim in Alice amounted to mere instructions to apply the abstract idea of intermediated settlement on a generic computer. A claim that purports to improve computer capabilities or to improve an existing technology may provide significantly more. See MPEP 2106.05(a) along with McRO, Inc. v. Bandai Namco Games Am. Inc., 837 F.3d 1299, 1314-15, 120 USPQ2d 1091, 1101-02 (Fed. Cir. 2016); and Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1335-36, 118 USPQ2d 1684, 1688-89 (Fed. Cir. 2016). However, a technical explanation as to how to implement the invention should be present in the specification for any assertion that the invention improves upon conventional functioning of a computer, or upon conventional technology or technological processes. That is, per MPEP 2106.05(a), the disclosure must provide sufficient details such that one of ordinary skill in the art would recognize the claimed invention as providing an improvement.
Here, Applicant’s specification does not include any discussion of how the claimed invention provides a technical improvement realized by these claims over the prior art or any explanation of a technical problem having an unconventional technical solution that is expressed in these claims. Instead, as in Affinity Labs of Tex. v. DirecTV, LLC 838 F.3d 1253, 1263-64, 120 USPQ2d 1201, 1207-08 (Fed. Cir. 2016), the specification fails to provide sufficient details regarding the manner in which the claimed invention accomplishes any technical improvement or solution. For at least the above reasons, the methods, system and apparatus of claims 11-19 are directed to applying an abstract idea as identified above on a general purpose computer without (i) improving the performance of the computer itself or providing a technical solution to a problem in a technical field according to MPEP 2106.05(a), or (ii) providing meaningful limitations to transform the abstract idea into a patent eligible application of the abstract idea such that these claims amount to significantly more than the abstract idea itself according to MPEP 2106.04(d)(2) and 2106.05(e). Taking the additional elements individually and in combination, the additional elements do not provide significantly more. Specifically, when viewed individually, the above-identified additional elements in independent claims 11, 18 and 19 (and their dependent claims) do not add significantly more because they are simply an attempt to limit the abstract idea to a particular technological environment according to MPEP 2106.05(h). When viewed as a combination, these above-identified additional elements simply instruct the practitioner to implement the claimed functions with well-understood, routine and conventional activity specified at a high level of generality in a particular technological environment according to MPEP 2106.05(h). 
When viewed as a whole, the above-identified additional elements do not provide meaningful limitations to transform the abstract idea into a patent eligible application of the abstract idea such that the claims amount to significantly more than the abstract idea itself according to MPEP 2106.04(d)(2) and 2106.05(e). Moreover, neither the general computer elements nor any other additional element adds meaningful limitations to the abstract idea because these additional elements represent insignificant extra-solution activity according to MPEP 2106.05(g). As such, there is no inventive concept sufficient to transform the claimed subject matter into a patent-eligible application as required by MPEP 2106.05. Therefore, for at least the above reasons, none of claims 11, 18 and 19 amounts to significantly more than the abstract idea itself. Accordingly, claims 11-19 are not patent eligible and are rejected under 35 U.S.C. 101.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 13-15 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Claim 13 defines a filtering rule xfilt,i(k) according to the equation, but fails to specify how the variable “k” is initialized.
For example, the filter would operate differently depending on whether k is initialized at the value 0 or 1. The specification, as originally filed, also fails to define “k.” Additionally, claim 13 specifies that the filtering rule also includes “at least further parameters of target curvature and object orientation” but is silent on how these parameters are applied in the filtering rule or which variables the target curvature and object orientation correspond to. For the purposes of examination, claim 13 is being interpreted such that k is initialized at 0 or 1 and the target curvature and object orientation are achieved by applying the filtering rule.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claim(s) 11, 12, 16 and 18-19 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Heenan (US Patent Application Publication 2006/0220912).

Regarding claims 11 and 16, Heenan discloses a method for filtering measurement data for a path-following control of an object, wherein the method comprises the following steps: acquiring measurement data of at least one portion of a trajectory along which the object (10) is to move, wherein the measurement data include a plurality of point coordinates which are each subject to local fluctuations and thereby form a point cloud [0053: “The vehicle…comprises two sensing or image acquisition means–a video camera 13 mounted to the front of the host vehicle 10 and a LIDAR sensor 14.
The camera sensor 13 produces a stream of output data, which are fed to an image processing board 15…The radar or LIDAR type sensor 14…provides object identification and allows the distance of the detected objects from the host vehicle 10 to be determined together with the bearing of the object relative to the host vehicle. The output of the LIDAR sensor 14 is also passed to an image processing board 16 and the data produced by the two image processing boards 15,16 is passed to a data processor 17 located within the vehicle which combines or fuses the image and object detection data”] [0056-0058: “The data processor performs both low level image processing and also higher level image processing functions on the data points output from the sensors…Looked at one way, the processor fits points that it believes to be part of a lane boundary to a curve, which is given by equation 1”]; weighting the plurality of point coordinates by assigning them each different weighting factors, wherein the different weighting factors each specify fluctuations of the point coordinates of the point cloud, wherein at least one point coordinate is fixed in an immediate vicinity of the object by assigning a weighting factor of smaller magnitude to the at least one point coordinate than to a point coordinate at a distance from the object [0060: “In order to fuse the data from the two sensors, a set of data points that are believed to lie on a line boundary are identified in the raw data. A weighting is then allocated to each data point indicating how reliable the data point is believed to be. This weighting is dependent upon the performance characteristics of each sensor and will be a function of range. The weighting value is varied with range depending on how likely the data sample point is likely to be defined by the limitation of the sensor within the operating environment. 
Hence, in the example given data points from the LIDAR data are weighted more heavily in the near range than the data points from the video data, whilst the video data is weighted more heavily in the distance”]; and performing the filtering for all point coordinates of the point cloud based on the different weighting factors for the path-following control of the object [0021: “The processing means may filter the data from the two sensing means to identify points in the image corresponding to one or more of: the right hand edge of a road, the left hand edge of the road, lane markings defining lanes in the road, the radius of curvature of the lane and or the road, and optionally the heading angle of the host vehicle relative to the road/lane. These detected points may be processed to determine the path of the lane boundaries ahead of the host vehicle”] [0023: “The processing means may be adapted to fuse the data points and weightings using on or more recursive processing techniques…The techniques that could be employed with the scope of the invention include recursive least squares (RLS) estimator or other process such as a Kalman filter which recursively produces estimates of lane boundaries taking into consideration the weightings applied to the data and optionally the confidence values. This means that the weightings are input to the filter along with the data points and influence the output of the filter”].

Regarding claim 12, Heenan further discloses wherein, as the object moves, the at least one point coordinate at a distance from the object moves in order to substantially form the at least one point coordinate in the immediate vicinity of the object after the object has traversed the at least one portion of the trajectory [0005, 0007, the measured data is acquired at new point coordinates ahead of the vehicle as the vehicle moves so that the lane boundaries ahead of the vehicle may be detected].
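The range-dependent sensor weighting Heenan describes in paragraph 0060 (LIDAR trusted near the vehicle, video trusted at distance) can be sketched as follows. This is an illustrative reconstruction only: the logistic blend, the 20 m crossover range, and the function names are invented for the example and are not taken from Heenan or the application.

```python
import math

def sensor_weights(range_m: float, crossover_m: float = 20.0) -> tuple[float, float]:
    """Illustrative range-dependent weighting in the spirit of Heenan [0060]:
    LIDAR points are weighted more heavily in the near range, video points
    more heavily at distance. The logistic blend and the 20 m crossover
    are assumptions made for this sketch."""
    video_w = 1.0 / (1.0 + math.exp(-(range_m - crossover_m) / 5.0))
    lidar_w = 1.0 - video_w
    return lidar_w, video_w

# Near the vehicle the LIDAR weight dominates; far away the video weight does.
print(sensor_weights(5.0))   # LIDAR weight close to 1
print(sensor_weights(60.0))  # video weight close to 1
```

A recursive estimator of the kind Heenan names in paragraph 0023 (RLS or a Kalman filter) would then consume each data point together with its weight, so more reliable points pull the lane-boundary estimate harder.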
Regarding claim 18, Heenan discloses a system configured to filter measurement data for a path-following control of an object, comprising: at least one sensor including a camera (13), configured to acquire the measurement data of at least one portion of a trajectory along which the object (10) is to move, wherein the measurement data include a plurality of point coordinates which are each subject to local fluctuations and thereby form a point cloud [0053: “The vehicle…comprises two sensing or image acquisition means–a video camera 13 mounted to the front of the host vehicle 10 and a LIDAR sensor 14. The camera sensor 13 produces a stream of output data, which are fed to an image processing board 15…The radar or LIDAR type sensor 14…provides object identification and allows the distance of the detected objects from the host vehicle 10 to be determined together with the bearing of the object relative to the host vehicle. The output of the LIDAR sensor 14 is also passed to an image processing board 16 and the data produced by the two image processing boards 15,16 is passed to a data processor 17 located within the vehicle which combines or fuses the image and object detection data”] [0056-0058: “The data processor performs both low level image processing and also higher level image processing functions on the data points output from the sensors…Looked at one way, the processor fits points that it believes to be part of a lane boundary to a curve, which is given by equation 1”]; a control unit (17) communicatively connected to the at least one sensor and configured to: weight the plurality of point coordinates by assigning them each different weighting factors, wherein the different weighting factors each specify fluctuations of the point coordinates of the point cloud, wherein at least one point coordinate is fixed in an immediate vicinity of the object by assigning a weighting factor of smaller magnitude to the at least one point coordinate than to a point coordinate 
at a distance from the object [0060: “In order to fuse the data from the two sensors, a set of data points that are believed to lie on a line boundary are identified in the raw data. A weighting is then allocated to each data point indicating how reliable the data point is believed to be. This weighting is dependent upon the performance characteristics of each sensor and will be a function of range. The weighting value is varied with range depending on how likely the data sample point is likely to be defined by the limitation of the sensor within the operating environment. Hence, in the example given data points from the LIDAR data are weighted more heavily in the near range than the data points from the video data, whilst the video data is weighted more heavily in the distance”]; and perform the filtering for all point coordinates of the point cloud based on the different weighting factors for the path-following control of the object [0021: “The processing means may filter the data from the two sensing means to identify points in the image corresponding to one or more of: the right hand edge of a road, the left hand edge of the road, lane markings defining lanes in the road, the radius of curvature of the lane and or the road, and optionally the heading angle of the host vehicle relative to the road/lane. These detected points may be processed to determine the path of the lane boundaries ahead of the host vehicle”] [0023: “The processing means may be adapted to fuse the data points and weightings using on or more recursive processing techniques…The techniques that could be employed with the scope of the invention include recursive least squares (RLS) estimator or other process such as a Kalman filter which recursively produces estimates of lane boundaries taking into consideration the weightings applied to the data and optionally the confidence values. 
This means that the weightings are input to the filter along with the data points and influence the output of the filter”].

Regarding claim 19, Heenan discloses a control unit to filter measurement data for a path-following control of an object, wherein the control unit is configured to: acquire measurement data of at least one portion of a trajectory along which the object (10) is to move, wherein the measurement data include a plurality of point coordinates which are each subject to local fluctuations and thereby form a point cloud [0053: “The vehicle…comprises two sensing or image acquisition means–a video camera 13 mounted to the front of the host vehicle 10 and a LIDAR sensor 14. The camera sensor 13 produces a stream of output data, which are fed to an image processing board 15…The radar or LIDAR type sensor 14…provides object identification and allows the distance of the detected objects from the host vehicle 10 to be determined together with the bearing of the object relative to the host vehicle.
The output of the LIDAR sensor 14 is also passed to an image processing board 16 and the data produced by the two image processing boards 15,16 is passed to a data processor 17 located within the vehicle which combines or fuses the image and object detection data”] [0056-0058: “The data processor performs both low level image processing and also higher level image processing functions on the data points output from the sensors…Looked at one way, the processor fits points that it believes to be part of a lane boundary to a curve, which is given by equation 1”]; weight the plurality of point coordinates by assigning them each different weighting factors, wherein the different weighting factors each specify fluctuations of the point coordinates of the point cloud, wherein at least one point coordinate is fixed in an immediate vicinity of the object by assigning a weighting factor of smaller magnitude to the at least one point coordinate than to a point coordinate at a distance from the object [0060: “In order to fuse the data from the two sensors, a set of data points that are believed to lie on a line boundary are identified in the raw data. A weighting is then allocated to each data point indicating how reliable the data point is believed to be. This weighting is dependent upon the performance characteristics of each sensor and will be a function of range. The weighting value is varied with range depending on how likely the data sample point is likely to be defined by the limitation of the sensor within the operating environment. 
Hence, in the example given data points from the LIDAR data are weighted more heavily in the near range than the data points from the video data, whilst the video data is weighted more heavily in the distance”]; and perform the filtering for all point coordinates of the point cloud based on the different weighting factors for the path-following control of the object [0021: “The processing means may filter the data from the two sensing means to identify points in the image corresponding to one or more of: the right hand edge of a road, the left hand edge of the road, lane markings defining lanes in the road, the radius of curvature of the lane and or the road, and optionally the heading angle of the host vehicle relative to the road/lane. These detected points may be processed to determine the path of the lane boundaries ahead of the host vehicle”] [0023: “The processing means may be adapted to fuse the data points and weightings using on or more recursive processing techniques…The techniques that could be employed with the scope of the invention include recursive least squares (RLS) estimator or other process such as a Kalman filter which recursively produces estimates of lane boundaries taking into consideration the weightings applied to the data and optionally the confidence values. This means that the weightings are input to the filter along with the data points and influence the output of the filter”].

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 13-14 is/are rejected under 35 U.S.C. 103 as being unpatentable over Heenan (US Patent Application Publication 2006/0220912) in view of Florent (US Patent Number 6,204,891).

Regarding claims 13-14, Heenan discloses the method of claim 11 as discussed above but does not disclose wherein the filtering is performed based on the different weighting factors for all point coordinates of the point cloud according to the following filtering rule:

xfilt,i(k) = (Fn,i · xi(k) + F0 · xfilt,i(k-1)) / (Fn,i + F0)

where x denotes the point coordinates to be filtered, each including an x coordinate and a y coordinate, and also including at least further parameters of target curvature and object orientation, where i corresponds to an index of the point coordinates and Fn,i corresponds to an assigned weighting factor, and where F0 specifies the weighting factor for a previous cycle in performing the filtering. Florent discloses a method for filtering point coordinates included in measured data of a point cloud according to the filtering rule recited in claim 13, the filtering rule being implemented using a PT1 filter (Col. 3, lines 1-10, images acquired from camera tube 4 include a plurality of point coordinates at each pixel which are subject to local fluctuations in intensity level) (Col.
3, lines 21-27, Equation 1 shows a PT1 filter that applies weighting factors W1 and W2 to all coordinates of the point cloud). Florent teaches that the filtering operation eliminates noise in acquired images due to spatial and temporal phenomena without degrading image detail (Col. 1, lines 13-18; Col. 3, lines 39-51). Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to apply the filtering rule disclosed by Florent to the images acquired as measurement data disclosed by Heenan to remove noise from the images without degrading image detail, to improve the accuracy of a path-following operation relying on the images.

Claim(s) 17 is/are rejected under 35 U.S.C. 103 as being unpatentable over Heenan (US Patent Application Publication 2006/0220912) in view of Swaminathan (US Patent Application Publication 2020/0326420).

Regarding claim 17, Heenan discloses the method of claim 11 as discussed above but does not disclose wherein the measurement data include 12 point coordinates, which each substantially have a distance of 30 cm. Swaminathan discloses acquiring measurement data having a set number of point coordinates at a set distance. Specifically, Swaminathan discloses acquiring images from a vehicle camera, and teaches that the image dimensions are defined by the number of pixels, each pixel occupying a point coordinate [0071]. Swaminathan also discloses determining the resolutions of the acquired images, where the resolution is a distance between pixels and a property of the camera [0064]. Swaminathan teaches that together the number of pixels and the distance between them affect whether the camera can distinguish between features in the environment ahead of the vehicle such as two or more road markings [0064, 0077]. Thus, the number of point coordinates and the distance between them are result-effective variables.
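The claim 13 filtering rule quoted in the rejection above is a first-order lag (PT1) filter: each filtered value is a weighted average of the current measurement and the previous cycle's output. A minimal sketch, assuming scalar coordinates and seeding the filter with the first measurement (the claim itself, per the §112(b) rejection, leaves the initialization of k unspecified):

```python
def pt1_filter(points, fn, f0):
    """Apply the claim-13 filtering rule
        x_filt_i(k) = (Fn_i * x_i(k) + F0 * x_filt_i(k-1)) / (Fn_i + F0)
    to a sequence of measurements x_i(k) for one point coordinate i.
    `fn` is the point's assigned weighting factor Fn_i; `f0` is the weighting
    factor for the previous filtering cycle. Seeding with the first
    measurement is an assumption, since the claim does not fix k's start."""
    filtered = [points[0]]  # assumed initialization: x_filt_i(0) = x_i(0)
    for x in points[1:]:
        prev = filtered[-1]
        filtered.append((fn * x + f0 * prev) / (fn + f0))
    return filtered

# A noisy roughly-constant signal is smoothed toward its underlying value;
# a larger F0 (more trust in the previous cycle) smooths more aggressively.
noisy = [1.0, 1.4, 0.7, 1.2, 0.9, 1.1]
print(pt1_filter(noisy, fn=1.0, f0=3.0))
```

With fn = 1 and f0 = 3, each step moves only a quarter of the way toward the new measurement, which is the discrete-time behavior the examiner maps onto Florent's weighted PT1 filter.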
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to include in the measurement data disclosed by Heenan the number of point coordinates and the distance between them that would allow distinguishing between features ahead of the vehicle, such as lane markings, since it has been held that discovering the optimum value of a result-effective variable involves only routine experimentation and would be within the level of one skilled in the art. See MPEP 2144.05. Claim(s) 20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Heenan (US Patent Application Publication 2006/0220912) in view of Breed (WO 00/54008 A1). Regarding claim 20, Heenan discloses a method for path-following control of an object including at least one sensor, at least one actuator, and at least one control unit, wherein the object is a vehicle (10), the method comprising the following steps: acquiring, by the at least one sensor (13, 14), measurement data of at least one portion of a trajectory along which the vehicle is to move, wherein the measurement data include a plurality of point coordinates which are each subject to local fluctuations and thereby form a point cloud [0053: “The vehicle…comprises two sensing or image acquisition means–a video camera 13 mounted to the front of the host vehicle 10 and a LIDAR sensor 14. The camera sensor 13 produces a stream of output data, which are fed to an image processing board 15…The radar or LIDAR type sensor 14…provides object identification and allows the distance of the detected objects from the host vehicle 10 to be determined together with the bearing of the object relative to the host vehicle.
The output of the LIDAR sensor 14 is also passed to an image processing board 16 and the data produced by the two image processing boards 15,16 is passed to a data processor 17 located within the vehicle which combines or fuses the image and object detection data”] [0056-0058: “The data processor performs both low level image processing and also higher level image processing functions on the data points output from the sensors…Looked at one way, the processor fits points that it believes to be part of a lane boundary to a curve, which is given by equation 1”]; processing, by the at least one control unit (17), the measurement data into actuator data, wherein, when processing the measurement data into actuator data, for filtering the measurement data [0054: “The fusion ensures that the data from one sensor can take preference over data from the other, or be given more significance than the other-according to the performance characteristics of the sensors and the range at which the data is collected”], the following steps are performed to reduce the local fluctuations of the point coordinates: weighting the plurality of point coordinates by assigning them each different weighting factors, wherein the different weighting factors each specify fluctuations of the point coordinates of the point cloud, wherein at least one point coordinate is fixed in an immediate vicinity of the object by assigning a weighting factor of smaller magnitude to the at least one point coordinate than to a point coordinate at a distance from the object [0060: “In order to fuse the data from the two sensors, a set of data points that are believed to lie on a line boundary are identified in the raw data. A weighting is then allocated to each data point indicating how reliable the data point is believed to be. This weighting is dependent upon the performance characteristics of each sensor and will be a function of range.
The weighting value is varied with range depending on how likely the data sample point is likely to be defined by the limitation of the sensor within the operating environment. Hence, in the example given data points from the LIDAR data are weighted more heavily in the near range than the data points from the video data, whilst the video data is weighted more heavily in the distance”]; and performing the filtering for all point coordinates of the point cloud based on the different weighting factors for the path-following control of the object [0021: “The processing means may filter the data from the two sensing means to identify points in the image corresponding to one or more of: the right hand edge of a road, the left hand edge of the road, lane markings defining lanes in the road, the radius of curvature of the lane and or the road, and optionally the heading angle of the host vehicle relative to the road/lane. These detected points may be processed to determine the path of the lane boundaries ahead of the host vehicle”] [0023: “The processing means may be adapted to fuse the data points and weightings using one or more recursive processing techniques…The techniques that could be employed within the scope of the invention include recursive least squares (RLS) estimator or other process such as a Kalman filter which recursively produces estimates of lane boundaries taking into consideration the weightings applied to the data and optionally the confidence values. This means that the weightings are input to the filter along with the data points and influence the output of the filter”]. Heenan does not disclose the fused actuator data being used by the control unit to control at least one actuator to carry out the path-following control of the vehicle.
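The range-dependent weighting Heenan describes in [0060] (LIDAR trusted in the near range, video in the distance) can be sketched as follows. The linear blend and the 20 m crossover distance are illustrative assumptions; Heenan does not give a specific weighting function.

```python
def range_dependent_weights(ranges, crossover=20.0):
    """Assign (lidar, video) weights to each data point by range, in the
    spirit of Heenan [0060]: LIDAR points are weighted more heavily in the
    near range, video points more heavily in the distance. The linear ramp
    and the crossover value are assumptions, not taken from the reference."""
    weights = []
    for r in ranges:
        # lidar weight falls from 1.0 at range 0 to 0.0 at 2 * crossover
        w_lidar = max(0.0, min(1.0, 1.0 - r / (2.0 * crossover)))
        weights.append((w_lidar, 1.0 - w_lidar))
    return weights
```

Per-point weights of this kind would then be fed, together with the data points, into a recursive estimator such as the RLS estimator or Kalman filter mentioned in [0023], so that the more reliable points dominate the fitted lane-boundary curve.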
Breed discloses a method for path-following control of an object including at least one sensor, at least one actuator, and at least one control unit (48), wherein the object is a vehicle, the method comprising the step of controlling, by the at least one control unit, the at least one actuator based on generated actuator data to carry out the path-following control of the vehicle [page 19, lines 13-24: “One way to imagine the system operation is to consider each car and roadway edge to behave as if it had a surrounding ‘force field’ that would prevent it from crashing into another vehicle or an obstacle along the roadway. A vehicle operator would be prevented from causing his or her vehicle to leave its assigned corridor. This is accomplished with a control system that controls the steering, acceleration and perhaps the vehicle brakes based on its knowledge of the location of the vehicle, highway boundaries and other nearby vehicles. In a preferred implementation, the location of the vehicle is determined by first using the GPS L1 signal to determine its location within approximately 100 meters. Then…the use of a MIR or similar system periodically permits the vehicle to determine its exact location and thereby determine the GPS corrections, eliminate the carrier cycle ambiguity and set the INS system”]. As discussed above, Breed teaches that controlling a vehicle steering actuator based on actuator data derived from measured data prevents the vehicle from colliding with another vehicle or obstacle along the roadway as it travels by maintaining the vehicle within its assigned corridor [page 19, lines 13-24].
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to use the at least one control unit disclosed by Heenan to control a steering actuator for path-following control as disclosed by Breed to prevent the vehicle from colliding with other vehicles or obstacles during vehicle motion. Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOSHUA CAMPBELL whose telephone number is (571) 272-8215. The examiner can normally be reached on Monday - Friday 9:00 AM – 5:00 PM. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Lindsay M. Low can be reached on (571) 272-1196. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /JOSHUA CAMPBELL/ Examiner, Art Unit 3747 /LOGAN M KRAFT/ Supervisory Patent Examiner, Art Unit 3747

Prosecution Timeline

Sep 23, 2024
Application Filed
Mar 01, 2026
Non-Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600451
MARINE PROPULSION SYSTEM, OUTBOARD MOTOR, AND MARINE VESSEL
2y 5m to grant Granted Apr 14, 2026
Patent 12587133
CONTROL DEVICE FOR VEHICLE
2y 5m to grant Granted Mar 24, 2026
Patent 12565181
SYSTEMS AND METHODS FOR ADJUSTING A TRAILER BRAKE GAIN TO OPTIMIZE VEHICLE EFFICIENCY WHEN TOWING
2y 5m to grant Granted Mar 03, 2026
Patent 12552455
SYSTEMS AND METHODS FOR CALIBRATING A VEHICLE STEERING ANGLE BY LEVERAGING A STEERING PINION ANGLE OFFSET AND A WHEEL ALIGNMENT FUNNEL
2y 5m to grant Granted Feb 17, 2026
Patent 12545069
HYDRAULIC LIFT FOR STAND-ALONE COIL SPRING
2y 5m to grant Granted Feb 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
54%
Grant Probability
76%
With Interview (+22.0%)
3y 7m
Median Time to Grant
Low
PTA Risk
Based on 457 resolved cases by this examiner. Grant probability derived from career allow rate.
