+Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
The present application claims priority to foreign application DE102022204089.9, filed on April 27, 2022.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 01/07/2026 has been entered.
Response to Arguments
Applicant's amendments and arguments filed 12/11/2025 have been fully considered but are not persuasive to overcome the rejection. Examiner responds to Applicant's arguments for the following reasons:
With regard to the Claim Rejection Under 35 U.S.C. 101: Applicant amended independent claims 1 and 12-13 to recite "wherein a determination of whether the ego vehicle will leave the currently traveled traffic lane or stay in the currently traveled traffic lane is made based on a temporal change of at least one distance between the trajectory and the detected traffic lane boundary over multiple consecutive images." This amendment does not overcome the 101 rejection because the "determination of whether the ego vehicle will leave …" represents a mental process directed to an abstract idea. See the updated 101 claim rejection below.
With regard to the Claim Rejection Under 35 U.S.C. 103: The amendment does not overcome the current claim rejection under 103 because the cited Weibwange reference discloses at [0060]-[0061] that an optimization is performed in order to select a trajectory and a respective behavior of the ego-vehicle. The behavior in the lateral direction determines whether the vehicle drives straight or changes the lane (left or right), and determines times for the lane-change start (lateral motion relative to the lane boundary starts), for passing a reference point during the lane change (lane boundary), and for the lane-change end (lateral motion relative to the lane boundary ends), based on the information received from sensor system 11 (see [0060]-[0061]+ & Fig. 2), which meets the scope of the claimed amendment.
Therefore, the rejection is maintained. See below.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-13 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The independent claims are shown below:
Claim 1. A method for detecting whether an ego vehicle will leave a currently traveled traffic lane of a roadway to the left or right or whether it will stay in the currently traveled traffic lane, the method comprising the following steps:
generating, using an image sensor, an image of a measuring space, which includes a vehicle area in front of the ego vehicle;
projecting an expected trajectory of the ego vehicle into the image;
detecting at least one traffic lane boundary laterally adjacent to the trajectory; and
making a decision whether the traffic lane will be changed or maintained by comparing the trajectory to the at least one detected traffic lane boundary,
wherein a determination of whether the ego vehicle will leave the currently traveled traffic lane or stay in the currently traveled traffic lane is made based on a temporal change of at least one distance between the trajectory and the detected traffic lane boundary over multiple consecutive images.
Claim 12. A control device for an ego vehicle, configured to detect whether an ego vehicle will leave a currently traveled traffic lane of a roadway to the left or right or whether it will stay in the currently traveled traffic lane, the control device configured to:
generate, using an image sensor, an image of a measuring space, which includes a vehicle area in front of the ego vehicle;
project an expected trajectory of the ego vehicle into the image;
detect at least one traffic lane boundary laterally adjacent to the trajectory; and
make a decision whether the traffic lane will be changed or maintained by comparing the trajectory to the at least one detected traffic lane boundary,
wherein a determination of whether the ego vehicle will leave the currently traveled traffic lane or stay in the currently traveled traffic lane is made based on a temporal change of at least one distance between the trajectory and the detected traffic lane boundary over multiple consecutive images.
Claim 13. An ego vehicle, comprising: an image sensor; and a control device configured to detect whether an ego vehicle will leave a currently traveled traffic lane of a roadway to the left or right or whether it will stay in the currently traveled traffic lane, the control device configured to:
generate, using the image sensor, an image of a measuring space, which includes a vehicle area in front of the ego vehicle;
project an expected trajectory of the ego vehicle into the image;
detect at least one traffic lane boundary laterally adjacent to the trajectory; and
make a decision whether the traffic lane will be changed or maintained by comparing the trajectory to the at least one detected traffic lane boundary;
wherein the image sensor is connected to the control device in a data-transmitting manner, for the acquisition and transmission of image data pertaining to the vehicle area in front of the ego vehicle to the control device.
wherein a determination of whether the ego vehicle will leave the currently traveled traffic lane or stay in the currently traveled traffic lane is made based on a temporal change of at least one distance between the trajectory and the detected traffic lane boundary over multiple consecutive images.
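For illustration only (not part of the claims or the record), the determination recited in the "wherein" clause above — deciding lane change versus lane keep from the temporal change of the trajectory-to-boundary distance over multiple consecutive images — can be sketched roughly as follows. All names and the threshold value are hypothetical and chosen only to make the sketch concrete:

```python
# Illustrative sketch of the claimed determination: decide whether the ego
# vehicle will leave or stay in its lane from the temporal change of the
# distance between the projected trajectory and the detected lane boundary,
# measured over multiple consecutive images. DRIFT_THRESHOLD is hypothetical.

DRIFT_THRESHOLD = 0.15  # hypothetical tolerated lateral drift per frame (m)

def lane_change_decision(distances):
    """distances: trajectory-to-lane-boundary distance (meters), one value
    per image in a sequence of temporally consecutive images."""
    if len(distances) < 2:
        return "stay"  # a temporal change needs at least two images
    # Average per-frame change of the lateral distance across the sequence.
    drift = (distances[-1] - distances[0]) / (len(distances) - 1)
    if drift < -DRIFT_THRESHOLD:
        return "leave"  # distance to the boundary shrinks frame over frame
    return "stay"       # distance roughly constant: lane is maintained
```

Under this sketch, a monotonically shrinking boundary distance (e.g., 1.5 m, 1.2 m, 0.9 m, 0.6 m over four frames) yields a "leave" decision, while a roughly constant distance yields "stay".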
101 Analysis - Step 1: Statutory category – Yes
The independent claims above recite a method, a control device, and an ego-vehicle including several processing functions. The claims therefore fall within one of the four statutory categories. See MPEP 2106.03.
101 Analysis - Step 2A Prong one evaluation: Judicial Exception – Yes – Mental processes.
In Step 2A, Prong one of the 2019 Patent Eligibility Guidance (PEG), a claim is to be analyzed to determine whether it recites subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) mental processes, and/or c) certain methods of organizing human activity.
The Office submits that the foregoing bolded limitation(s) constitute judicial exceptions in terms of "mental processes" because, under their broadest reasonable interpretation, the limitations can be "performed in the human mind, or by a human using a pen and paper." See MPEP 2106.04(a)(2)(III).
The claims recite the limitations of "generate, using the image sensor …"; "project an expected trajectory …"; "detect at least one traffic lane …"; "making a decision …"; and "a determination of whether the ego vehicle will leave …". Those limitations, as drafted, are simple processes that, under their broadest reasonable interpretation, cover performance in the human mind but for the recitation of a "control device" (in claims 12 and 13). That is, other than reciting the "control device," nothing in the claim elements precludes the steps from practically being performed in the mind. For example, but for the "control device" language, the claims encompass a person looking at the collected data and making a simple decision. The mere nominal recitation of the control device does not take the claim limitations out of the mental process grouping.
Thus, the claims recite a mental process.
101 Analysis - Step 2A Prong two evaluation: Practical Application – No
In Step 2A, Prong Two of the 2019 PEG, a claim is to be evaluated to determine whether, as a whole, it integrates the recited judicial exception into a practical application. As noted in MPEP 2106.04(d), it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception, such that the claim is more than a drafting effort designed to monopolize the judicial exception. The courts have indicated that additional elements such as merely using a computer to implement an abstract idea, adding insignificant extra-solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a "practical application."
The Office submits that the foregoing bolded limitation(s) recite additional elements that do not integrate the recited judicial exception into a practical application.
The claims recite additional elements, e.g., in claim 1, "generating, using an image sensor …"; in claim 12, "a control device for an ego vehicle …"; and in claim 13, "an ego vehicle comprising an image sensor, and a control device …".
Here, the generating step using the image sensor is recited at a high level of generality (i.e., as a general means of gathering road condition data for use in the detecting and decision-making steps) and amounts to mere data gathering, which is a form of insignificant extra-solution activity.
Further, the "control device" and "ego vehicle" merely describe how to generally "apply" the mental process above using a generic or general-purpose computer controlling a vehicle. The control device of the ego vehicle is recited at a high level of generality and merely automates the "making a decision" step.
Accordingly, even in combination, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
101 Analysis - Step 2B evaluation: Inventive concept – No
In Step 2B of the 2019 PEG, a claim is to be evaluated as to whether the claim, as a whole, amounts to significantly more than the recited exception, i.e., whether any additional element, or combination of additional elements, adds an inventive concept to the claim. See MPEP 2106.05.
As discussed with respect to Step 2A Prong Two, the additional elements in the claim amount to no more than mere instructions to apply the exception using a generic computer component. The same analysis applies here in 2B, i.e., mere instructions to apply an exception on a generic computer cannot integrate a judicial exception into a practical application at Step 2A or provide an inventive concept in Step 2B.
Under the 2019 PEG, a conclusion that an additional element is insignificant extra-solution activity in Step 2A should be re-evaluated in Step 2B. Here, the "generating …", "detecting …", "make a decision …", and "determination of whether the ego vehicle will leave …" steps were considered to be insignificant extra-solution activity in Step 2A, and thus they are re-evaluated in Step 2B to determine whether they are more than what is well-understood, routine, conventional activity in the field. The background recites a conventional ego vehicle including an environment sensor installed in the vehicle, and the specification does not provide any indication that the vehicle's control device is anything other than a conventional computer within a vehicle. MPEP 2106.05(d)(II), and the cases cited therein, including Intellectual Ventures I, LLC v. Symantec Corp., 838 F.3d 1307, 1321 (Fed. Cir. 2016), TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610 (Fed. Cir. 2016), and OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363 (Fed. Cir. 2015), indicate that mere collection or receipt of data over a network is a well-understood, routine, and conventional function when it is claimed in a merely generic manner (as it is here).
Further, under the Federal Circuit's decisions in Trading Techs. Int'l v. IBG LLC, 921 F.3d 1084, 1093 (Fed. Cir. 2019), and Intellectual Ventures I LLC v. Erie Indemnity Co., 850 F.3d 1315, 1331 (Fed. Cir. 2017), for example, the "generating …", "detecting …", "make a decision …", and "determination of whether the ego vehicle will leave …" functions are well-understood, routine, and conventional. Accordingly, the conclusion that the functions are well-understood, routine, conventional activity is supported under Berkheimer.
Thus, the claims are ineligible.
Dependent Claims
Dependent claims 2-11 do not recite any further limitations that cause the claims to be patent eligible. Rather, the limitations of the dependent claims are directed toward additional aspects of the judicial exception and/or well-understood, routine, and conventional additional elements that do not integrate the judicial exception into a practical application. Therefore, dependent claims 2-11 are not patent eligible under the same rationale as provided in the rejection of claims 1 and 12-13.
Therefore, claims 1-13 are ineligible under 35 U.S.C. § 101.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-9 and 12-13 are rejected under 35 U.S.C. 103 as being unpatentable over Weibwange (20190377352) in view of Maass (20150194055).
With regard to claim 1, Weibwange discloses a method for detecting whether an ego vehicle will leave a currently traveled traffic lane of a roadway to the left or right or whether it will stay in the currently traveled traffic lane, the method comprising the following steps:
generating, using an image sensor, an image of a measuring space, which includes a vehicle area in front of the ego vehicle (an ego-vehicle includes sensors such as a camera, LIDAR, and RADAR providing the sensed environment, which is forwarded to a processor 12 that generates a current situation of the vehicle's environment, see [0052]-[0053]+. One of ordinary skill in the art would know that a LiDAR sensor can be used to generate image data and that LiDAR can be integrated with a camera to create more complete 3D visualizations);
projecting an expected trajectory of the ego vehicle into the image (based on the current situation and taking into consideration the potential ego-vehicle behaviors, a conditional prediction for the other traffic is processed as a set of predicted future behaviors for the other vehicles, see [0055]-[0056]+; based on the future situation currently considered, the parameters of the ego-vehicle trajectory are optimized, and after having optimized the trajectories of the vehicles for each of the situations, the ego-vehicle trajectory is generated, see [0070]-[0080]+); and
making a decision whether the traffic lane will be changed or maintained by comparing the trajectory to the at least one detected traffic lane boundary (for selecting a trajectory and a respective behavior of the ego-vehicle, an optimization is performed. The trajectory represents the expected behavior in both the lateral as well as the longitudinal direction, see [0060]. For controlling an ego-vehicle in highway situations, a decision to control the ego vehicle is based on the surrounding vehicles' behavior and the current layout of the traffic situation, such as the vehicles' positions and the lane layout, see [0049]+. Based on the ego-vehicle behavior in specific future situations, the ego-vehicle is controlled, i.e., times for a lane change or remaining in the current lane, see [0065]-[0076]+).
Wherein a determination of whether the ego vehicle will leave the currently traveled traffic lane or stay in the currently traveled traffic lane is made based on a temporal change of at least one distance between the trajectory and the detected traffic lane boundary over multiple consecutive images (for selecting a trajectory and a respective behavior of the ego-vehicle, an optimization is performed. The behavior in the lateral direction determines whether the vehicle drives straight or changes the lane (left or right), and determines times for the lane-change start (lateral motion relative to the lane boundary starts), for passing a reference point during the lane change (lane boundary), and for the lane-change end (lateral motion relative to the lane boundary ends), see [0060]-[0061]+).
Weibwange does not clearly teach detecting the traffic lane boundary laterally adjacent to the trajectory.
Maass discloses an ego-vehicle includes an assistance system which detects the traffic lane boundary laterally adjacent to the trajectory (see [0048]-[0049]+).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Weibwange by including detecting the traffic lane boundary laterally adjacent to the trajectory, as taught by Maass, for improving the ego-vehicle's performance.
With regard to claim 2, Weibwange teaches that the method as recited in claim 1, wherein an image sequence of multiple temporally consecutive images is examined (the number of possible situations are considered, see [0055]+).
With regard to claim 3, Weibwange teaches that the method as recited in claim 1, wherein the trajectory is determined from a proper movement of the ego vehicle (optimal trajectory for the ego-vehicle is obtained, see [0040]+).
With regard to claim 4, Maass teaches that the method as recited in claim 1, wherein the trajectory includes a left and a right vehicle boundary of the ego vehicle (the trajectory defines a lateral offset over time, see [0049]+).
With regard to claim 5, Maass teaches that the method as recited in claim 1, wherein the comparison includes a determination of a distance of the trajectory, including of the left and/or a right vehicle boundary, from at least one detected traffic lane boundary (see [0021]-[0022]+).
With regard to claim 6, Maass teaches that the method as recited in claim 5, wherein: a first and/or second and/or third and/or fourth distance is used in the distance determination of the distance of the trajectory from the at least one traffic lane boundary, wherein: the first distance is measured from the left vehicle boundary to a next traffic lane boundary situated to the left of the left vehicle boundary, the second distance is measured from the right vehicle boundary to a next traffic lane boundary situated to the right of the right vehicle boundary, the third distance is measured from the left vehicle boundary to the next traffic lane boundary situated to the right of the left vehicle boundary, the fourth distance is measured from the right vehicle boundary to the next traffic lane boundary situated to the left of the right vehicle boundary (see [0048]+).
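For illustration only (not part of the claims or the record), the four distances recited in claim 6 can be sketched as follows, assuming lateral positions measured in meters along a single image line, increasing to the right; all names and values are hypothetical:

```python
# Illustrative sketch of the four distances of claim 6, measured between the
# projected vehicle boundaries and the detected traffic lane boundaries.
# Positions are hypothetical lateral coordinates (meters, increasing rightward)
# for a vehicle traveling inside one lane.

def four_distances(left_vb, right_vb, left_lane, right_lane):
    """left_vb/right_vb: lateral positions of the ego vehicle's left/right
    boundary; left_lane/right_lane: positions of the lane boundaries to the
    left and right of the vehicle."""
    d1 = left_vb - left_lane    # left vehicle boundary -> boundary to its left
    d2 = right_lane - right_vb  # right vehicle boundary -> boundary to its right
    d3 = right_lane - left_vb   # left vehicle boundary -> boundary to its right
    d4 = right_vb - left_lane   # right vehicle boundary -> boundary to its left
    return d1, d2, d3, d4
```

For a 3.5 m lane with the vehicle centered (e.g., boundaries at 0.0 m and 3.5 m, vehicle edges at 0.8 m and 2.7 m), the first two distances are the lateral margins on each side, and the last two span from each vehicle edge to the opposite lane boundary.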
With regard to claim 7, Weibwange teaches that the method as recited in claim 5, wherein a decision that the ego vehicle will leave the currently traveled traffic lane is made when at least one distance changes over time (see [0029]+).
With regard to claim 8, Weibwange teaches that the method as recited in claim 5, wherein a decision that the ego vehicle will stay in the current traffic lane is made when at least one distance remains constant over time (keep a safety distance, see [0029]+).
With regard to claim 9, Weibwange teaches that the method as recited in claim 6, wherein the determination of the distances takes place in a predetermined image line of the recorded image (the vehicle 10 includes a memory 13 which stores information data and conditions from the sensor system, see [0054]+. The condition information is used to determine distances relative to the ego-vehicle, time to collision, etc., see [0029]+).
With regard to claim 12, Weibwange discloses a control device for an ego vehicle, configured to detect whether an ego vehicle will leave a currently traveled traffic lane of a roadway to the left or right or whether it will stay in the currently traveled traffic lane (an ego-vehicle 10 includes a processor 12, see Fig. 2), the control device configured to:
generate, using an image sensor, an image of a measuring space, which includes a vehicle area in front of the ego vehicle (an ego-vehicle includes sensors such as a camera, LIDAR, and RADAR providing the sensed environment, which is forwarded to a processor 12 that generates a current situation of the vehicle's environment, see [0052]-[0053]+. One of ordinary skill in the art would know that a LiDAR sensor can be used to generate image data and that LiDAR can be integrated with a camera to create more complete 3D visualizations);
project an expected trajectory of the ego vehicle into the image (based on the current situation and taking into consideration the potential ego-vehicle behaviors, a conditional prediction for the other traffic is processed as a set of predicted future behaviors for the other vehicles, see [0055]-[0056]+, and the ego-vehicle trajectory is generated, see [0070]+); and
make a decision whether the traffic lane will be changed or maintained by comparing the trajectory to the at least one detected traffic lane boundary (for selecting a trajectory and a respective behavior of the ego-vehicle, an optimization is performed. The trajectory represents the expected behavior in both the lateral as well as the longitudinal direction. Based on the ego-vehicle behavior in specific future situations, the ego-vehicle is controlled, i.e., times for a lane change or remaining in the current lane, see [0065]-[0076]+).
Wherein a determination of whether the ego vehicle will leave the currently traveled traffic lane or stay in the currently traveled traffic lane is made based on a temporal change of at least one distance between the trajectory and the detected traffic lane boundary over multiple consecutive images (for selecting a trajectory and a respective behavior of the ego-vehicle, an optimization is performed. The behavior in the lateral direction determines whether the vehicle drives straight or changes the lane (left or right), and determines times for the lane-change start (lateral motion relative to the lane boundary starts), for passing a reference point during the lane change (lane boundary), and for the lane-change end (lateral motion relative to the lane boundary ends), see [0060]-[0061]+).
Weibwange does not clearly teach detecting the traffic lane boundary laterally adjacent to the trajectory.
Maass discloses an ego-vehicle including an assistance system which detects the traffic lane boundary laterally adjacent to the trajectory (see [0048]-[0049]+).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Weibwange by including detecting the traffic lane boundary laterally adjacent to the trajectory, as taught by Maass, for improving the ego-vehicle's performance.
With regard to claim 13, Weibwange discloses an ego-vehicle, comprising: an image sensor; and a control device configured to detect whether an ego vehicle will leave a currently traveled traffic lane of a roadway to the left or right or whether it will stay in the currently traveled traffic lane (an ego-vehicle 10 includes a system 11, a processor 12, etc., see Fig. 2), the control device configured to:
generate, using the image sensor, an image of a measuring space, which includes a vehicle area in front of the ego vehicle (an ego-vehicle includes sensors such as a camera, LIDAR, and RADAR providing the sensed environment, which is forwarded to a processor 12 that generates a current situation of the vehicle's environment, see [0052]-[0053]+. One of ordinary skill in the art would know that a LiDAR sensor can be used to generate image data and that LiDAR can be integrated with a camera to create more complete 3D visualizations);
project an expected trajectory of the ego vehicle into the image (based on the current situation and taking into consideration the potential ego-vehicle behaviors, a conditional prediction for the other traffic is processed as a set of predicted future behaviors for the other vehicles, see [0055]-[0056]+, and the ego-vehicle trajectory is generated, see [0070]+); and
make a decision whether the traffic lane will be changed or maintained by comparing the trajectory to the at least one detected traffic lane boundary (for selecting a trajectory and a respective behavior of the ego-vehicle, an optimization is performed. The trajectory represents the expected behavior in both the lateral as well as the longitudinal direction. Based on the ego-vehicle behavior in specific future situations, the ego-vehicle is controlled, i.e., times for a lane change or remaining in the current lane, see [0065]-[0076]+).
Wherein a determination of whether the ego vehicle will leave the currently traveled traffic lane or stay in the currently traveled traffic lane is made based on a temporal change of at least one distance between the trajectory and the detected traffic lane boundary over multiple consecutive images (for selecting a trajectory and a respective behavior of the ego-vehicle, an optimization is performed. The behavior in the lateral direction determines whether the vehicle drives straight or changes the lane (left or right), and determines times for the lane-change start (lateral motion relative to the lane boundary starts), for passing a reference point during the lane change (lane boundary), and for the lane-change end (lateral motion relative to the lane boundary ends), see [0060]-[0061]+).
Weibwange does not clearly teach detecting the traffic lane boundary laterally adjacent to the trajectory.
Maass discloses an ego-vehicle including an assistance system which detects the traffic lane boundary laterally adjacent to the trajectory (see [0048]-[0049]+).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Weibwange by including detecting the traffic lane boundary laterally adjacent to the trajectory, as taught by Maass, for improving the ego-vehicle's performance.
Claims 10-11 are rejected under 35 U.S.C. 103 as being unpatentable over Weibwange and Maass as applied to claim 1 above, and further in view of Patzwaldt (20210312203).
With regard to claim 10, Weibwange and Maass disclose the claimed subject matter but fail to teach that the determination of the trajectory is additionally implemented using map data from a navigation system available in the ego vehicle.
Patzwaldt discloses an autonomous vehicle which determines one or more path planning using map data from a navigation system (see [0061]-[0063]+ & [0081]-[0082]+).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Weibwange by including detecting the traffic lane boundary laterally adjacent to the trajectory, as taught by Maass, and further including using map data from a navigation system for determining the trajectory, as taught by Patzwaldt, for generating the ego-vehicle's trajectory more accurately.
With regard to claim 11, Patzwaldt teaches that the method as recited in claim 1, wherein the detection of the at least one traffic lane boundary takes place using an image analysis using a neural network (the projected sensor data is applied to a neural network which identifies areas of interest, see [0027]+).
Prior Arts Cited
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
Kabushiki (20220057992) discloses an information processing system for predicting with higher stability using aerial photographic images, see the summary section.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to NGA X NGUYEN whose telephone number is (571)272-5217. The examiner can normally be reached M-F 5:30AM - 2:30PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, JELANI SMITH can be reached at 571-270-3969. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
NGA X. NGUYEN
Examiner
Art Unit 3662
/NGA X NGUYEN/Primary Examiner, Art Unit 3662