Prosecution Insights
Last updated: April 19, 2026
Application No. 18/767,229

METHOD AND DEVICE FOR CREATING A DIGITAL MAP

Final Rejection (§103, §112)
Filed: Jul 09, 2024
Examiner: ARTIMEZ, DANA FERREN
Art Unit: 3667
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: Robert Bosch GmbH
OA Round: 2 (Final)
Grant Probability: 58% (Moderate)
Expected OA Rounds: 3-4
Expected Time to Grant: 3y 2m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 58% (46 granted / 80 resolved), +5.5% vs Tech Center average
Interview Lift: +43.9% (strong), among resolved cases with an interview
Average Prosecution: 3y 2m; 42 applications currently pending
Career History: 122 total applications across all art units

Statute-Specific Performance

§101: 19.0% (-21.0% vs TC avg)
§103: 46.2% (+6.2% vs TC avg)
§102: 7.3% (-32.7% vs TC avg)
§112: 24.6% (-15.4% vs TC avg)
Tech Center averages are estimates; based on career data from 80 resolved cases.
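The headline figures in the panels above are simple ratios over the examiner's 80 resolved cases. A minimal sketch of the arithmetic follows; the Tech Center average and the without-interview rate are back-derived from the displayed deltas, not independently sourced:

```python
# Illustrative recomputation of the examiner panel figures above.
# Counts (46 granted / 80 resolved) come from the panel; tc_avg and
# without_interview are implied by the "+5.5%" and "+43.9%" labels.

granted, resolved = 46, 80
career_allow_rate = granted / resolved          # 0.575, displayed as 58%

tc_avg = career_allow_rate - 0.055              # implied TC-average allow rate
with_interview = 0.99                           # panel: "99% With Interview"
interview_lift = 0.439                          # panel: "+43.9% Interview Lift"
without_interview = with_interview - interview_lift  # implied ~55.1%

print(f"{career_allow_rate:.1%} career | ~{without_interview:.1%} w/o interview")
```

Under these assumptions, the 58% headline is the rounded 46/80 ratio, and an interview roughly closes the gap from the mid-fifties to near-certain allowance.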

Office Action

§103 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA. The Examiner notes that the rejections are grounded in the broadest reasonable interpretation of the claim language. Applicant is kindly invited to consider each reference as a whole. References are to be interpreted as by one of ordinary skill in the art rather than as by a novice. See MPEP 2141. Therefore, the relevant inquiry when interpreting a reference is not what the reference expressly discloses on its face but what the reference would teach or suggest to one of ordinary skill in the art.

Status of the Claims

This is a Final Office Action in response to Applicant's amendment of 02 January 2026. Claims 1-7 are pending and have been considered as follows.

Response to Amendment and/or Argument

Applicant's amendments and arguments with respect to the rejection of Claims 1-5 under 35 U.S.C. 101, as set forth in the Office Action of 01 October 2025, have been considered and are persuasive. The rejection of Claims 1-5 under 35 U.S.C. 101 set forth in the Office Action of 01 October 2025 is therefore withdrawn.

Applicant's arguments with respect to claims 1 and 4-6 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Applicant's amendments and arguments with respect to the rejection of Claims 1-6 under 35 U.S.C. 112(a), as set forth in the Office Action of 01 October 2025, have been considered and are NOT persuasive.
Specifically, Applicant argues (Pages 6-7 of Applicant's Remarks dated 02 January 2026): [Applicant's argument reproduced as two greyscale images in the original action.]

The Examiner's Response: The Examiner has carefully considered Applicant's arguments and respectfully disagrees for the following reasons:

Regarding Argument (I): Applicant asserts that the claims recite physical and sensor-related criteria for identifying areas suitable for automated driving; however, the original specification does not describe that "intrinsic localization is possible only when lane boundaries, curb structures, vertical roadside elements or similar high-confidence environmental features fall within the detection regions of the vehicle's sensors". Merely labeling the claimed features as a minimum set of sensor-detectable reference structures sufficient to enable "lane-accurate intrinsic relative localization" does not establish possession, as the specification fails to provide concrete examples, thresholds, or operative conditions that define how an area qualifies. Accordingly, Applicant's arguments rely on technical details that were not disclosed in the originally filed specification rather than on features actually supported in the original disclosure.

Regarding Argument (II): Regarding the creation of the digital map by supplementing the base map with the identified areas, the Examiner notes that the specification may describe layering map data, but it does not disclose how the "identified areas" are determined based on the sufficiency of sensor-detectable features. Therefore, while the claim language describes supplementing the map with area-specific annotations, this limitation depends on the same undisclosed criteria for identifying the area (see response to Argument (I)). As a result, the amendments and arguments do not overcome the written-description deficiency, and the 35 U.S.C. 112(a) rejection is maintained.
Applicant's amendments and arguments with respect to the rejection of Claims 1-6 under 35 U.S.C. 112(b), as set forth in the Office Action of 01 October 2025, have been considered and are NOT persuasive. Specifically, Applicant argues (Pages 7-8 of Applicant's Remarks dated 02 January 2026): [Applicant's argument reproduced as a greyscale image in the original action.]

The Examiner's Response: The Examiner has carefully considered Applicant's arguments and finds them NOT persuasive. Although the amended claim removes the phrase "areas that make automated driving possible", it continues to define the identified areas in terms of functional results and terms of degree that lack objective metes and bounds. The limitation "minimum set of sensor-detectable reference structures" remains indefinite, as the claim provides no baseline or measurable criteria for what constitutes a minimum set, no specification of relevant sensor capabilities or operating conditions, and no structural definition of the referenced structures. Similarly, "lane-accurate intrinsic relative localization" is a term of degree without any expressed accuracy threshold or defined reference frame. The term "absence of externally provided absolute localization data" is indefinite because it is unclear what sources qualify as external or absolute and whether partial, intermittent, or indirect localization inputs are excluded. Accordingly, the amended limitation does not provide clear and objective boundaries, and the 35 U.S.C. 112(b) rejection is maintained.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C.
112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 1-7 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claims contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.

Regarding Claim 1 (and similarly independent claims 4-6), Applicant has apparently not described in the specification, in sufficient detail, by what algorithm(s) or by what steps/procedure the recited limitation is achieved, particularly the limitation "identifying, areas as being designated for automated driving…providing a minimum set of sensor-detectable reference structures that enable lane-accurate intrinsic relative localization of the automated vehicle in absence of externally provided absolute localization data".
While the claim defines the desired outcome of identifying suitable areas for automated driving, the specification ([0015-0019, 0034-0035]) provides only a general description that such areas may include a portion of the traffic route and lateral strips, with width based on sensor configurations and surroundings, and further notes that these areas allow the vehicle's sensor system to reliably detect lanes and that different sensor types affect what areas can be detected; however, the specification does not disclose concrete criteria, thresholds, or operative procedures by which such areas are determined. The specification further fails to describe, as of the time of filing, how to determine a "minimum set" of reference structures sufficient for intrinsic lane-accurate localization, nor does it provide examples or guidance for evaluating when an area meets such criteria. Without disclosure of these specifics, the claim recites only the result of a process (identifying areas suitable for automated driving) rather than the process or structural features by which that result is achieved.

Additionally, the claim recites "creating the digital map by supplementing the base map with data representing the identified areas." The specification ([0004-0007, 0014-0015]) provides a general description of certain areas based on sensor-detectable features and lane configurations and of the digital map as a map with one or more map layers, but it does not describe (i) how the determined areas are incorporated into the base map (in particular, what algorithm(s) is used); (ii) what format or data model is used to represent the supplemented digital map; (iii) whether the process is performed offline, in real time, or via a specific mapping system; or (iv) how the supplemented digital map is structured to support use in an automated driving system.
The specification may describe map layers or general map organization, but it does not disclose how the identified areas are derived from sensor-detectable reference structures or lane configurations, leaving the claimed supplementing step dependent on the same undisclosed criteria.

See the 2019 35 U.S.C. 112 Compliance Federal Register Notice (Federal Register, Vol. 84, No. 4, Monday, January 7, 2019, pages 57 to 63). See also http://ptoweb.uspto.gov/patents/exTrain/documents/2019-112-guidance-initiative.pptx. Quoting the FR Notice at pages 61 and 62: "The Federal Circuit emphasized that '[t]he written description requirement is not met if the specification merely describes a "desired result."' Vasudevan, 782 F.3d at 682 (quoting Ariad, 598 F.3d at 1349). . . . When examining computer-implemented, software-related claims, examiners should determine whether the specification discloses the computer and the algorithm(s) that achieve the claimed function in sufficient detail that one of ordinary skill in the art can reasonably conclude that the inventor possessed the claimed subject matter at the time of filing. An algorithm is defined, for example, as 'a finite sequence of steps for solving a logical or mathematical problem or performing a task.' Microsoft Computer Dictionary (5th ed., 2002). Applicant may 'express that algorithm in any understandable terms including as a mathematical formula, in prose, or as a flow chart, or in any other manner that provides sufficient structure.' Finisar, 523 F.3d at 1340 (internal citation omitted). It is not enough that one skilled in the art could theoretically write a program to achieve the claimed function; rather, the specification itself must explain how the claimed function is achieved to demonstrate that the applicant had possession of it. See, e.g., Vasudevan, 782 F.3d at 682-83.
If the specification does not provide a disclosure of the computer and algorithm(s) in sufficient detail to demonstrate to one of ordinary skill in the art that the inventor possessed the invention that achieves the claimed result, a rejection under 35 U.S.C. 112(a) for lack of written description must be made. See MPEP § 2161.01, subsection I."

Accordingly, the Examiner believes that Applicant has not demonstrated to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention. The dependent claims, which depend upon the rejected independent claims, are also rejected under 112, first paragraph, by virtue of their dependence upon the rejected independent claims.

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-7 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Regarding Claim 1 (and similarly independent claims 4-6), the amended limitation "identifying…area as being designated for automated driving…absolute localization data" is indefinite for at least the following reasons:

The limitation "identifying…area as being designated for automated driving" lacks objective criteria defining when an area qualifies as being "designated". The claim does not specify measurable parameters, thresholds, or decision rules that distinguish designated areas from non-designated areas, leaving the boundary of this limitation indeterminate.

The limitation "providing a minimum set of sensor-detectable reference structures" further renders the claim indefinite because the term "minimum" is a term of degree without an objective baseline or metric, "sensor-detectable" depends on unspecified sensor types and operating conditions, and "reference structures" lacks structural definition.

The limitation "enable lane-accurate intrinsic relative localization" fails to provide reasonable certainty as to claim scope because "lane-accurate" is an undefined term of degree, "enable" describes a result rather than a boundary, and "intrinsic relative localization" lacks a clear performance standard. The limitation provides no objective measure by which compliance can be assessed.

The limitation "…in absence of externally provided absolute localization data" is unclear because the claim does not define what constitutes external or absolute localization, nor whether partial, intermittent, or indirect sources of localization are excluded.

See Nautilus, Inc. v. Biosig Instruments, Inc. (U.S. Supreme Court, 2014), which held: "A patent is invalid for indefiniteness if its claims, read in light of the patent's specification and prosecution history, fail to inform, with reasonable certainty, those skilled in the art about the scope of the invention."
See also In re Packard, 751 F.3d 1307 (Fed. Cir. 2014) ("[A] claim is indefinite when it contains words or phrases whose meaning is unclear," i.e., "ambiguous, vague, incoherent, opaque, or otherwise unclear in describing and defining the claimed invention.") and Ex parte McAward, Appeal No. 2015-006416 (PTAB, Aug. 25, 2017, Precedential) ("Applying the broadest reasonable interpretation of a claim, then, the Office establishes a prima facie case of indefiniteness with a rejection explaining how the metes and bounds of a pending claim are not clear because the claim contains words or phrases whose meaning is unclear."). The dependent claims, which depend upon the rejected independent claims, are also rejected under 112, second paragraph, by virtue of their dependence upon the rejected independent claims.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office Action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-2 and 4-6 are rejected under 35 U.S.C. 103 as being unpatentable over Reschka et al. (US 2020/0310450 A1, hereinafter Reschka) in view of Wilbers et al. (US 2021/0341310 A1, hereinafter Wilbers).

Regarding Claim 1 (similarly claims 4-5), Reschka discloses a computer-implemented method for creating a digital map for use in automated driving (see at least Abstract), the method comprising the following steps: retrieving, by a processing system that includes at least one processor, map data values representing a base map, wherein the base map includes information on traffic routes with one or more lanes and surroundings features, wherein the information represents at least one configuration of the lanes, wherein the surroundings features represent geographical objects which can be detected by a surroundings sensor system of a vehicle; (see at least Fig. 1-6, [0009-0097]: receiving map data for a new or unverified region (e.g., the new region may be an extension of an already-mapped region or a completely new region) for which the autonomous vehicle has not yet been verified for travel, wherein the map data may be any number of data types, including representative information of a locale such as speed limits, traffic signals, stop signs, lane indicators, crosswalks, number of lanes, etc. The map data may include two-dimensional and/or three-dimensional information about the new region.
The map segmentation component can implement a number of techniques for identifying segments in map data, including image data, or may use previous driving data from one or more vehicles. The computing device can receive sensor data from the vehicle and can generate and/or update maps based on the sensor data.)

identifying, by the processor system, areas as being designated for automated driving based on the configuration of the lanes and the surroundings features; (see at least Fig. 1-6, [0009-0097]: The map segmentation component can identify portions of drivable surfaces in the unverified regions. Such segments may include junctions of connecting roads and/or connecting roads extending between junctions. The map segmentation component can perform image processing techniques to identify drivable surfaces and/or segments thereof. The detailed map data of a region navigable by the autonomous vehicle may include information about the physical extents and arrangements of the drivable surface. Such information can include lengths and widths of streets, layouts of intersections, severity of surface grades, and positions and arrangements of buildings, fire hydrants, or other fixtures and other elements.)

creating, by the processor system, the digital map by supplementing the base map with data representing the identified areas; and incorporating the digital map in the automated vehicle, the automated vehicle being configured to perform an automated driving operation based on the incorporated digital map. (see at least Fig. 1-6, [0009-0097]: the verified map data may include detailed maps used by one or more autonomous vehicles to travel in an area, for example, utilizing a simultaneous localization and mapping technique. For example, the verified map data may include a 3D mesh of an environment, including a drivable surface, road marking information, traffic control information, or the like.
The computing device(s) can receive the sensor data (raw or processed) and can generate and/or update maps based on the sensor data. For instance, the computing device(s) can compare map data of a new region, e.g., one in which the autonomous vehicle is not verified for driving, to verified map data, e.g., from areas in which the autonomous vehicle is verified or otherwise configured for driving, to identify segments of the new region in which the autonomous vehicle can travel. For instance, the computing device(s) can generate updated map data including the regions determined to be navigable and supply, or otherwise make available to the vehicle 402, the updated map data. Accordingly, the vehicle can be controlled to navigate in new, e.g., previously unmapped, geographic areas; the system can also identify shadows in images and generate textured 3D maps without shadows.)

It may be alleged that Reschka does not explicitly teach identifying, by the processor system, areas as being designated for automated driving based on the configuration of the lanes and the surrounding features providing a minimum set of sensor-detectable reference structures that enable lane-accurate intrinsic relative localization of the automated vehicle in absence of externally provided absolute localization data.

Wilbers is directed to a system and method for estimating the quality of localization using sensor detection. Wilbers teaches identifying, by the processor system, areas as being designated for automated driving based on the configuration of the lanes and the surrounding features providing a minimum set of sensor-detectable reference structures that enable lane-accurate intrinsic relative localization of the automated vehicle in absence of externally provided absolute localization data. (see at least Fig. 4-7, [0020-0021, 0066-0088]: The transportation vehicle scans the surrounding area.
The distances to specific features of the surrounding area recorded in the map, which are also referred to as landmarks, are determined through evaluation of the images captured by the sensors, which results in improved accuracy of the self-localization. The quality of localization is correspondingly dependent on the availability of detectable features of the surrounding area, the precision of the detection, and the precision of the map entries. Wilbers describes estimating the quality of localization based on environmental features and their detectability by onboard sensors; specifically, it predicts whether a vehicle's self-localization will succeed or fail in a given environment by estimating a global recognition impairment based on the visibility and detectability of landmarks among the surrounding features, the field of view and characteristics of the vehicle sensors, and the match between expected and observed features. The quality of localization is estimated based on the calculation of the covariance of the positions of the landmarks. That is, the suitability of a region for autonomous driving is quantified using measurable landmark data; the covariance of landmark positions provides an objective metric of whether landmarks enable lane-accurate intrinsic localization.)

Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified Reschka's system and method for updating/generating new map data of previously unverified regions to incorporate the technique of determining, for each map region, whether the set of detectable landmarks provides sufficient localization confidence, as taught by Wilbers, with a reasonable expectation of success, to provide an improved method for identifying areas suitable for autonomous driving based on measurable sensor-detectable features enabling intrinsic localization.
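Wilbers's teaching that localization quality can be estimated from the covariance of landmark positions lends itself to a short illustration. The sketch below is purely illustrative, not taken from Wilbers or the claims: the function name, the threshold, and the smallest-eigenvalue criterion are all assumptions. The intuition is that landmarks spread around the vehicle constrain pose in both axes, while collinear landmarks leave one direction geometrically unconstrained.

```python
import numpy as np

def localization_quality(landmarks_xy, min_spread=0.5):
    """Toy proxy for a covariance-based localization-quality check.

    landmarks_xy: iterable of (x, y) positions of detected landmarks.
    Returns (weakest, ok): the smallest eigenvalue of the 2x2 landmark
    position covariance (the least-constrained direction) and whether it
    clears an illustrative threshold. Collinear landmarks yield ~0 in
    the perpendicular direction, i.e., poor geometric constraint.
    """
    cov = np.cov(np.asarray(landmarks_xy, dtype=float).T)
    weakest = float(np.linalg.eigvalsh(cov)[0])  # eigenvalues ascending
    return weakest, weakest >= min_spread

# Landmarks spread around the vehicle constrain both axes...
spread = [(0, 5), (4, -2), (-3, 1), (2, 3)]
# ...while collinear landmarks (e.g., a single lane marking) do not.
collinear = [(0, 0), (1, 0), (2, 0), (3, 0)]
```

Under these invented numbers, `localization_quality(spread)` passes while `localization_quality(collinear)` fails, mirroring the Office Action's point that suitability is quantified from measurable landmark geometry rather than from the result-oriented claim language.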
Regarding Claim 2, the combination of Reschka in view of Wilbers discloses the method according to claim 1. Reschka further discloses wherein the information on the traffic routes includes a width of a relevant lane and/or a total width of a traffic route and/or properties of road markings of the relevant lane. (see at least Fig. 1-6, [0009-0097]: In some examples, techniques described herein can parse map data (e.g., verified map data and/or map data of a new region) into multiple segments of a drivable surface. For example, techniques of this disclosure may generate segments from the map data. In some instances, a segment can include a junction segment, e.g., an intersection, a merge, or the like, or a connecting road segment, e.g., an extent of a road between junctions. Systems described herein can also associate data with each of the individual segments. For instance, a junction segment may include a junction type, e.g., a merge, a "T," a round-about, or the like; a number of roads meeting at the junction; a relative position of those roads, e.g., an angle between the roads meeting at the junction; information about traffic control signals at the junction; and/or other features. Data associated with a connecting road segment can include a number of lanes, a width of those lanes, a direction of travel in each of the lanes, an identification of parking lanes, a speed limit on the road segment, and/or other features.)

Regarding Claim 6, Reschka discloses a method for operating an automated vehicle (see at least Abstract), comprising the following steps: determining a current position of the automated vehicle using satellite-based positioning; (see at least Fig.
1-6, [0009-0097]: the localization component can include functionality to receive data from sensor systems to determine a position of the vehicle and can utilize SLAM or CLAMS to receive image data, lidar data, radar data, IMU data, GPS data, wheel encoder data, and the like to accurately determine a location of the autonomous vehicle) localizing the current position in a digital map (see at least Fig. 1-6, [0009-0097]: the planning component 424 can determine a route to travel from a first location (e.g., a current location) to a second location (e.g., a target location)) that is created by: retrieving map data values representing a base map, wherein the base map includes information on traffic routes with one or more lanes and surroundings features, wherein the information represents at least one configuration of the lanes, wherein the surroundings features represent geographical objects that are detectable by a surroundings sensor system of an automated vehicle; (see at least Fig. 1-6, [0009-0097]: receiving map data for a new or unverified region (e.g., the new region may be an extension of an already-mapped region or a completely new region) for which the autonomous vehicle has not yet been verified for travel, wherein the map data may be any number of data types, including representative information of a locale such as speed limits, traffic signals, stop signs, lane indicators, crosswalks, number of lanes, etc. The map data may include two-dimensional and/or three-dimensional information about the new region. The map segmentation component can implement a number of techniques for identifying segments in map data, including image data, or may use previous driving data from one or more vehicles. The computing device can receive sensor data from the vehicle and can generate and/or update maps based on the sensor data.)
identifying, by the processor system, areas as being designated for automated driving based on the configuration of the lanes and the surroundings features; (see at least Fig. 1-6, [0009-0097]: The map segmentation component can identify portions of drivable surfaces in the unverified regions. Such segments may include junctions of connecting roads and/or connecting roads extending between junctions. The map segmentation component can perform image processing techniques to identify drivable surfaces and/or segments thereof. The detailed map data of a region navigable by the autonomous vehicle may include information about the physical extents and arrangements of the drivable surface. Such information can include lengths and widths of streets, layouts of intersections, severity of surface grades, and positions and arrangements of buildings, fire hydrants, or other fixtures and other elements.)

creating the digital map by supplementing the base map with data representing the identified areas; and based on the localization, determining a driving strategy that includes a trajectory from the current position to one of the identified areas; and operating the automated vehicle according to the driving strategy. (see at least Fig. 1-6, [0009-0097]: the planning component 424 can determine various routes and trajectories at various levels of detail. For example, the planning component 424 can determine a route to travel from a first location (e.g., a current location) to a second location (e.g., a target location). For the purpose of this discussion, a route can be a sequence of waypoints for travelling between two locations. In at least one example, the map(s) 428 may include at least one map (e.g., images and/or a mesh) generated in accordance with the techniques discussed herein. For example, the map(s) 428 may include information about drivable regions in a new environment, wherein the drivable regions are determined according to the techniques described herein.
In some instances, the map(s) 428 may include information only about drivable regions, whereas other implementations can include map data of an entire region, including map data of undrivable regions. In some examples, the vehicle 402 can be controlled based at least in part on the map(s) 428. That is, the maps 428 can be used in connection with the localization component 420, the perception component 422, and/or the planning component 424 to determine a location of the vehicle 402, identify objects in an environment, and/or generate routes and/or trajectories to navigate within an environment.)

It may be alleged that Reschka does not explicitly teach identifying, by the processor system, areas as being designated for automated driving based on the configuration of the lanes and the surrounding features providing a minimum set of sensor-detectable reference structures that enable lane-accurate intrinsic relative localization of the automated vehicle in absence of externally provided absolute localization data.

Wilbers is directed to a system and method for estimating the quality of localization using sensor detection. Wilbers teaches identifying, by the processor system, areas as being designated for automated driving based on the configuration of the lanes and the surrounding features providing a minimum set of sensor-detectable reference structures that enable lane-accurate intrinsic relative localization of the automated vehicle in absence of externally provided absolute localization data. (see at least Fig. 4-7, [0020-0021, 0066-0088]: The transportation vehicle scans the surrounding area. The distances to specific features of the surrounding area recorded in the map, which are also referred to as landmarks, are determined through evaluation of the images captured by the sensors, which results in improved accuracy of the self-localization.
The quality of localization is correspondingly dependent on the availability of detectable features of the surrounding area, the precision of the detection, and the precision of the map entries. Wilbers describes estimating the quality of localization based on environmental features and their detectability by onboard sensors; specifically, it predicts whether a vehicle's self-localization will succeed or fail in a given environment by estimating a global recognition impairment based on the visibility and detectability of landmarks among the surrounding features, the field of view and characteristics of the vehicle sensors, and the match between expected and observed features. The quality of localization is estimated based on the calculation of the covariance of the positions of the landmarks. That is, the suitability of a region for autonomous driving is quantified using measurable landmark data; the covariance of landmark positions provides an objective metric of whether the landmarks enable lane-accurate intrinsic localization.)

Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified Reschka's system and method for updating/generating new map data of previously unverified regions to incorporate the technique of determining, for each map region, whether the set of detectable landmarks provides sufficient localization confidence, as taught by Wilbers, with a reasonable expectation of success, to provide an improved method for identifying areas suitable for autonomous driving based on measurable sensor-detectable features for enabling intrinsic localization.

Claim(s) 3 and 7 are rejected under 35 U.S.C. 103 as being unpatentable over Reschka in view of Wilbers and Viswanathan (US 2020/0202143 A1).
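The covariance-based quality metric attributed to Wilbers above can be illustrated with a minimal sketch. This is illustrative only, not an implementation from either reference; the function name, the eigenvalue test, and the threshold are all assumptions:

```python
import numpy as np

def localization_quality(landmark_positions, threshold=0.5):
    """Estimate whether a set of detectable landmarks supports
    lane-accurate localization, using the spread (covariance) of
    landmark positions as an objective metric. Illustrative only."""
    pts = np.asarray(landmark_positions, dtype=float)  # shape (N, 2)
    if len(pts) < 3:
        return False  # too few landmarks to constrain the pose
    cov = np.cov(pts, rowvar=False)        # 2x2 position covariance
    eigvals = np.linalg.eigvalsh(cov)      # spread along principal axes
    # Landmarks constrain the pose in a direction only if they are
    # spread out along it; require spread along both axes.
    return bool(eigvals.min() > threshold)

# Landmarks spread in both x and y constrain lateral and longitudinal pose.
good = localization_quality([(0, 0), (10, 1), (5, 8), (12, 6)])   # True
# Collinear landmarks leave one direction unconstrained.
bad = localization_quality([(0, 0), (5, 0), (10, 0)])             # False
```

A region whose landmark covariance is nearly singular (e.g., landmarks along a single line) would fail such a test even if many landmarks are visible, which is the sense in which the metric is "objective" rather than a simple landmark count.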
Regarding Claim 3, the combination of Reschka in view of Wilbers teaches the method according to claim 1. It may be alleged that the combination of Reschka in view of Wilbers does not explicitly teach further comprising retrieving sensor information, wherein the sensor information represents at least detection areas of different sensor types, and the areas are determined depending on the detection areas in relation to a spatial arrangement of the lanes.

Viswanathan is directed to vision-based mapping. Viswanathan teaches retrieving sensor information, wherein the sensor information represents at least detection areas of different sensor types, and the areas are determined depending on the detection areas in relation to a spatial arrangement of the lanes. (see at least Fig. 1-6 [0029-0072]: Autonomous vehicles may also be equipped with a plurality of sensors to facilitate autonomous vehicle control. Sensors may include image sensors/cameras, LiDAR, GPS, Inertial Measurement Units (IMUs), or the like, which may measure the surroundings of a vehicle and communicate information regarding the surroundings to a vehicle control module to process and adapt vehicle control accordingly. HD maps may be generated and updated based on sensor data from vehicles traveling along road segments of a road network. These vehicles may have various degrees of autonomy and may be equipped with a variety of different levels of sensors. Sensors from fully autonomous vehicles, for example, may be used to update map data or generate new map data in the form of crowd-sourced data from vehicles traveling along road segments. Sensor data received can be compared against other sensor data relating to the images captured by sensors to establish the accuracy of the sensor data and to confirm the position, size, shape, etc. of features and objects along the road segment.)
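The crowd-sourced confirmation step Viswanathan describes (comparing new sensor detections against stored map features to confirm position, size, shape, etc.) can be sketched as follows. The data layout, feature names, and tolerance are hypothetical, chosen only to illustrate the comparison:

```python
import math

# Hypothetical map entries and fresh detections: feature_id -> (x, y) in meters.
map_features = {"hydrant_17": (12.0, 3.5), "sign_42": (40.2, -1.1)}
detections = {"hydrant_17": (12.3, 3.4), "sign_42": (44.0, -1.0)}

def confirm_features(map_features, detections, tolerance=1.0):
    """Flag each map feature as confirmed when its newly detected
    position agrees with the stored position within `tolerance` meters."""
    confirmed = {}
    for fid, (mx, my) in map_features.items():
        if fid in detections:
            dx, dy = detections[fid]
            confirmed[fid] = math.hypot(dx - mx, dy - my) <= tolerance
    return confirmed

result = confirm_features(map_features, detections)
# hydrant_17 agrees to within ~0.3 m; sign_42 deviates by ~3.8 m
# and would be flagged for a map update or further review.
```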
Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Reschka and Wilbers to incorporate the technique of retrieving sensor information, wherein the sensor information represents at least detection areas of different sensor types, and the areas are determined depending on the detection areas in relation to a spatial arrangement of the lanes, as taught by Viswanathan, with a reasonable expectation of success, to improve the efficiency of localization through a reduction in the search space explored to establish an accurate location more effectively (Viswanathan [0001]).

Regarding Claim 7, the combination of Reschka in view of Wilbers teaches the method according to claim 1. It may be alleged that Reschka does not explicitly teach further comprising retrieving, by the processing system, sensor-capability information describing detection regions of one or more sensor types of the automated vehicle, wherein identifying the areas designated for automated driving comprises, for each of a plurality of candidate areas.

Wilbers is directed to a system and method for estimating the quality of localization using sensor detection. Wilbers teaches retrieving, by the processing system, sensor-capability information describing detection regions of one or more sensor types of the automated vehicle, wherein identifying the areas designated for automated driving comprises, for each of a plurality of candidate areas. (see at least Fig. 3 [0060-0071]: The transportation vehicle is equipped with two environment sensors, a video camera and LIDAR sensors, wherein the environment sensors, which are capable of detecting the surrounding area of the transportation vehicle, are to be used for different distances and different purposes.)
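The Claim 7 analysis turns on whether reference landmarks lie within the detection regions of particular sensor types (e.g., a camera and LIDAR used for different distances and purposes). A minimal geometric sketch of such a containment test, with all sensor parameters and names assumed for illustration rather than taken from Wilbers or Viswanathan:

```python
import math
from dataclasses import dataclass

@dataclass
class Sensor:
    """Simplified detection region: a maximum range and a horizontal
    field of view centered on the vehicle heading. Illustrative only."""
    max_range: float  # meters
    fov_deg: float    # total horizontal field of view

def in_detection_region(sensor, landmark, vehicle_pos=(0.0, 0.0), heading_deg=0.0):
    """Return True if the landmark lies inside the sensor's detection region."""
    dx = landmark[0] - vehicle_pos[0]
    dy = landmark[1] - vehicle_pos[1]
    if math.hypot(dx, dy) > sensor.max_range:
        return False
    bearing = math.degrees(math.atan2(dy, dx)) - heading_deg
    bearing = (bearing + 180.0) % 360.0 - 180.0  # normalize to [-180, 180)
    return abs(bearing) <= sensor.fov_deg / 2.0

camera = Sensor(max_range=80.0, fov_deg=120.0)
lidar = Sensor(max_range=150.0, fov_deg=360.0)

# A landmark 100 m straight ahead is beyond the camera's range
# but within the lidar's, so only the lidar can use it for localization.
landmark = (100.0, 0.0)
```

Repeating such a test for every reference structure in a candidate area, per sensor type, is one way the combined teaching of a "detection region" check per candidate area could be realized.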
Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified Reschka's system and method for updating/generating new map data of previously unverified regions to incorporate the technique of retrieving sensor-capability information describing detection regions of one or more sensor types of the automated vehicle, wherein identifying the areas designated for automated driving comprises, for each of a plurality of candidate areas, as taught by Wilbers, with a reasonable expectation of success, to provide an improved method for identifying areas suitable for autonomous driving based on measurable sensor-detectable features for enabling intrinsic localization.

It may be alleged that the combination of Reschka in view of Wilbers does not explicitly teach determining, based on a respective spatial arrangement of the lanes in the respective candidate area, whether those of the surrounding features that provide a minimum set of sensor-detectable reference structures for carrying out the intrinsic relative localization lie within the detection regions of the corresponding sensor types.

Viswanathan is directed to vision-based mapping. Viswanathan teaches determining, based on a respective spatial arrangement of the lanes in the respective candidate area, whether those of the surrounding features that provide a minimum set of sensor-detectable reference structures for carrying out the intrinsic relative localization lie within the detection regions of the corresponding sensor types. (see at least Fig. 1-6 [0029-0072]: Autonomous vehicles may also be equipped with a plurality of sensors to facilitate autonomous vehicle control.
Sensors may include image sensors/cameras, LiDAR, GPS, Inertial Measurement Units (IMUs), or the like, which may measure the surroundings of a vehicle and communicate information regarding the surroundings to a vehicle control module to process and adapt vehicle control accordingly. HD maps may be generated and updated based on sensor data from vehicles traveling along road segments of a road network. These vehicles may have various degrees of autonomy and may be equipped with a variety of different levels of sensors. Sensors from fully autonomous vehicles, for example, may be used to update map data or generate new map data in the form of crowd-sourced data from vehicles traveling along road segments. Sensor data received can be compared against other sensor data relating to the images captured by sensors to establish the accuracy of the sensor data and to confirm the position, size, shape, etc. of features and objects along the road segment.)

Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Reschka and Wilbers to incorporate the technique of determining, based on a respective spatial arrangement of the lanes in the respective candidate area, whether those of the surrounding features that provide a minimum set of sensor-detectable reference structures for carrying out the intrinsic relative localization lie within the detection regions of the corresponding sensor types, as taught by Viswanathan, with a reasonable expectation of success, to improve the efficiency of localization through a reduction in the search space explored to establish an accurate location more effectively (Viswanathan [0001]).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DANA F ARTIMEZ, whose telephone number is (571) 272-3410. The examiner can normally be reached M-F, 9:00 am-3:30 pm EST. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Faris S. Almatrahi, can be reached at (313) 446-4821. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DANA F ARTIMEZ/
Examiner, Art Unit 3667

/FARIS S ALMATRAHI/
Supervisory Patent Examiner, Art Unit 3667

Prosecution Timeline

Jul 09, 2024
Application Filed
Sep 24, 2025
Non-Final Rejection — §103, §112
Jan 02, 2026
Response Filed
Feb 10, 2026
Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596371
SYSTEM AND METHOD FOR INTERCEPTION AND COUNTERING UNMANNED AERIAL VEHICLES (UAVS)
2y 5m to grant Granted Apr 07, 2026
Patent 12573078
METHOD AND APPARATUS FOR DETERMINING VEHICLE LOCATION BASED ON OPTICAL CAMERA COMMUNICATION
2y 5m to grant Granted Mar 10, 2026
Patent 12571646
Automated Discovery and Monitoring of Uncrewed Aerial Vehicle Ground-Support Infrastructure
2y 5m to grant Granted Mar 10, 2026
Patent 12560441
METHOD AND APPARATUS FOR OPTIMIZING A MULTI-STOP TOUR WITH FLEXIBLE MEETING LOCATIONS
2y 5m to grant Granted Feb 24, 2026
Patent 12560936
SYSTEMS AND METHODS FOR OBJECT DETECTION
2y 5m to grant Granted Feb 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
58%
Grant Probability
99%
With Interview (+43.9%)
3y 2m
Median Time to Grant
Moderate
PTA Risk
Based on 80 resolved cases by this examiner. Grant probability derived from career allow rate.
