Prosecution Insights
Last updated: April 19, 2026
Application No. 18/898,808

METHOD FOR ASSISTING IN THE CREATION OF AN ELEVATION MAP

Non-Final OA: §101, §102, §103
Filed: Sep 27, 2024
Examiner: COOLEY, CHASE LITTLEJOHN
Art Unit: 3662
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Robert Bosch GmbH
OA Round: 1 (Non-Final)
Grant Probability: 67% (Favorable)
Predicted OA Rounds: 1-2
Estimated Time to Grant: 3y 1m
Grant Probability With Interview: 88%

Examiner Intelligence

Career Allow Rate: 67% (116 granted / 173 resolved), +15.1% vs TC avg (above average)
Interview Lift: +20.4% higher allowance among resolved cases with an interview
Avg Prosecution: 3y 1m typical timeline; 46 applications currently pending
Career History: 219 total applications across all art units

Statute-Specific Performance

§101: 12.7% (-27.3% vs TC avg)
§103: 52.6% (+12.6% vs TC avg)
§102: 19.0% (-21.0% vs TC avg)
§112: 14.2% (-25.8% vs TC avg)

Tech Center averages are estimates, based on career data from 173 resolved cases.
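The headline allowance rate above follows directly from the grant counts. As a quick sanity check, this short sketch (plain arithmetic, no external data) also derives the Tech Center average implied by the reported delta, assuming "+15.1% vs TC avg" is in percentage points:

```python
# Sanity check of the examiner statistics quoted above.
granted, resolved = 116, 173

allow_rate = granted / resolved  # 0.6705... -> reported as 67%
assert round(allow_rate * 100) == 67

# Assuming the "+15.1% vs TC avg" delta is in percentage points,
# the implied Tech Center 3600 average allowance rate is about 52%.
tc_avg = allow_rate - 0.151
print(f"career allow rate: {allow_rate:.1%}, implied TC average: {tc_avg:.1%}")
```

The same subtraction applied to the statute-specific deltas recovers the per-statute Tech Center baselines shown in the chart.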

Office Action

Grounds of rejection: §101, §102, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

Claims 1-11 of US Application No. 18/898,808, filed on 09/27/2024, are currently pending and have been examined.

Information Disclosure Statement

The Information Disclosure Statements filed on 09/27/2024 and 11/01/2024 have been considered. An initialed copy of form 1449 for each is enclosed herewith.

Specification

The disclosure is objected to because of the following informalities: on page 11, line 10, the specification states: "…within the meaning of 11escryiption is integrated in the camera…" The examiner believes this to be a typo and that it should read "…within the meaning of the description is integrated in the camera…" Appropriate correction is required.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art.
The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph: (A) the claim limitation uses the term "means" or "step" or a term used as a substitute for "means" that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function; (B) the term "means" or "step" or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word "for" (e.g., "means for") or another linking word or phrase, such as "configured to" or "so that"; and (C) the term "means" or "step" or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word "means" (or "step") in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. That presumption is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word "means" (or "step") in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. That presumption is rebutted when the claim limitation recites function without reciting sufficient structure, material, or acts to entirely perform the recited function.

Claim limitations in this application that use the word "means" (or "step") are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations that do not use the word "means" (or "step") are not being so interpreted, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word "means," but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) use a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function, and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: "an ascertaining device configured to ascertain an elevation…" in claims 7 and 8. Because these claim limitation(s) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, they are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. Here, the term "ascertaining device" will be interpreted in light of "An ascertaining device within the meaning of the [descri]ption is integrated in the camera, for example. This means, therefore, that according to one embodiment, it can be provided that the camera itself ascertains the elevation of the location in the surroundings of the land vehicle relative to a reference location based on the image data." (pg. 11 of the instant specification).

If applicant does not intend to have these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid the interpretation (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-11 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

Step 1

Claim 1 is directed towards a method for assisting in a creation of an elevation map. Claim 7 is directed towards an apparatus configured to assist in a creation of an elevation map. Claim 8 is directed towards a land vehicle. Claim 9 is directed towards a map server. Claim 10 is directed towards a method for creating an elevation map. Claim 11 is directed towards a non-transitory machine-readable storage medium on which is stored a computer program for assisting in the creation of an elevation map.

Step 2A, Prong 1

A claim that recites an abstract idea, a law of nature, or a natural phenomenon is directed to a judicial exception.
Abstract ideas include the following groupings of subject matter, when recited as such in a claim limitation: (a) mathematical concepts – mathematical relationships, mathematical formulas or equations, mathematical calculations; (b) certain methods of organizing human activity – fundamental economic principles or practices (including hedging, insurance, mitigating risk); commercial or legal interactions (including agreements in the form of contracts; legal obligations; advertising, marketing or sales activities or behaviors; business relations); managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions); and (c) mental processes – concepts performed in the human mind (including an observation, evaluation, judgment, opinion). See the 2019 Revised Patent Subject Matter Eligibility Guidance.

In the instant application, independent claim 1 recites: "…detecting surroundings of a land vehicle… ascertaining an elevation of a location in the surroundings of the land vehicle relative to a reference location based on the image data…" Independent claims 7, 8, and 11 recite substantially similar limitations. Independent claim 9 recites: "…create an elevation map based on the ascertained elevation." Independent claim 10 recites substantially similar limitations. These claim limitations, when given their broadest reasonable interpretation, may be performed in the human mind. Therefore, these limitations are abstract ideas, and claims 1 and 7-11 are directed to a judicial exception.

Step 2A, Prong 2

Even when a judicial exception is recited in the claim, an additional claim element(s) that integrates the judicial exception into a practical application of that exception renders the claim eligible under §101.
A claim that integrates a judicial exception into a practical application will apply, rely on, or use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that the claim is more than a drafting effort designed to monopolize the judicial exception.

The following examples are indicative that an additional element or combination of elements may integrate the judicial exception into a practical application: the additional element(s) reflects an improvement in the functioning of a computer, or an improvement to other technology or technical field; the additional element(s) applies or uses a judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition; the additional element(s) implements a judicial exception with, or uses a judicial exception in conjunction with, a particular machine or manufacture that is integral to the claim; the additional element(s) effects a transformation or reduction of a particular article to a different state or thing; and the additional element(s) applies or uses the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception.

Examples in which the judicial exception has not been integrated into a practical application include: the additional element(s) merely recites the words "apply it" (or an equivalent) with the judicial exception, or merely includes instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea; the additional element(s) adds insignificant extra-solution activity to the judicial exception; and the additional element does no more than generally link the use of a judicial exception to a particular technological environment or field of use.
See the 2019 Revised Patent Subject Matter Eligibility Guidance.

In the instant application, claims 1 and 7-11 do not recite additional elements that integrate the judicial exception into a practical application of that exception. Claims 9 and 10 recite "a processor device" at a high level. The specification identifies the processor generically as a general processor, i.e., one or more processors – See specification at pg. 12. The processor(s) is merely a computer used as a tool to perform the abstract idea. Claims 1, 7, 8, and 11 recite a "camera" generically, e.g., a camera already installed on a vehicle – See at least pg. 4-5. Thus, the camera is merely a generic device for data gathering. Claims 1 and 7-11 recite "a map server"; however, this server is described as a generic cloud computing device – See at least pg. 12. Claim 11 further recites a "non-transitory machine readable storage medium on which is stored a computer program..." These combinations of elements also merely describe a generic computer that is used as a tool to perform the abstract idea.

These steps are not meaningful limitations on the judicial exception. The processor, camera, server, storage medium, and computer program are recited so generically (no details are provided beyond the fact that they are a processor, camera, server, storage medium, and computer program) that they represent no more than mere instructions to apply the judicial exception on a computer. These limitations can also be viewed as nothing more than an attempt to generally link the use of the judicial exception to the technological environment of a computer. Because the courts have made clear that mere physicality or tangibility of an additional element or elements is not a relevant consideration in the eligibility analysis, the physical nature of these computer components does not affect this analysis.
See MPEP 2106.05(I) for more information on this point, including explanations from judicial decisions such as Alice Corp. Pty. Ltd. v. CLS Bank Int'l, 573 U.S. 208, 224-26 (2014). Therefore, claims 1 and 7-11 do not recite additional elements that integrate the judicial exception into a practical application of that exception.

Claim 1 further recites: "…ascertaining image data based on the detection, by using a camera of the land vehicle…" This limitation is a form of extra-solution activity, i.e., mere data gathering, and fails to integrate the judicial exception into a practical application of that exception. Claims 7, 8, and 11 recite substantially similar limitations, and the same analysis applies to them. Claims 1 and 7-11 recite a communication network that is used to send and receive data between a remote server and the vehicle. The act of sending and receiving data is also extra-solution activity, i.e., mere data gathering, and fails to integrate the judicial exception into a practical application of that exception.

Step 2B

Finally, even when a judicial exception is recited in the claim, an additional claim element(s) that amounts to significantly more than the judicial exception renders the claim eligible under §101.
Examples that are not enough to amount to significantly more than the abstract idea include: (1) mere instructions to implement the abstract idea on a computer; (2) simply appending well-understood, routine, and conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception, e.g., a claim to an abstract idea requiring no more than a generic computer to perform generic computer functions that are well-understood, routine, and conventional activities previously known to the industry; (3) adding insignificant extra-solution activity to the judicial exception; and (4) generally linking the use of the judicial exception to a particular technological environment or field of use. Examples of generic computing functions that are not enough to amount to significantly more than the abstract idea include: (1) performing repetitive calculations; (2) receiving, processing, and storing data; (3) electronically scanning or extracting data from a physical document; (4) electronic recordkeeping; (5) automating mental tasks; and (6) receiving or transmitting data over a network, e.g., using the Internet to gather data.

In the instant application, claims 1 and 7-11 do not include additional elements that are sufficient to amount to significantly more than the judicial exception. The same analysis used above to determine whether the recited additional elements integrate the judicial exception into a practical application applies equally here. Based on the above analysis, claims 1 and 7-11 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
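The framework the examiner walks through (Step 1, Step 2A Prongs 1 and 2, Step 2B) can be summarized as a decision procedure. The sketch below is purely illustrative, not USPTO tooling; the boolean inputs stand in for the examiner's findings at each step:

```python
def eligible_under_101(statutory_category: bool,
                       recites_judicial_exception: bool,
                       practical_application: bool,
                       significantly_more: bool) -> bool:
    """Illustrative sketch of the Alice/2019 PEG eligibility flow."""
    if not statutory_category:          # Step 1: process/machine/manufacture/composition?
        return False
    if not recites_judicial_exception:  # Step 2A, Prong 1: no exception -> eligible
        return True
    if practical_application:           # Step 2A, Prong 2: integrated -> eligible
        return True
    return significantly_more           # Step 2B

# The Office Action's findings for claims 1 and 7-11: statutory categories,
# but a mental-process abstract idea, no practical application, and
# nothing significantly more.
print(eligible_under_101(True, True, False, False))  # False -> rejected
```

Under this flow, a rejection under §101 can be overcome at either Prong 2 or Step 2B without disputing that an abstract idea is recited at all.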
Claim 2 recites: "wherein the land vehicle is a motor vehicle or a rail vehicle," which further defines the abstract idea identified above. The claim does not recite any additional elements and, therefore, does not recite any additional elements that integrate the judicial exception into a practical application of that exception or amount to significantly more than the judicial exception.

Claim 3 recites: "wherein the elevation of the location is ascertained using the following formula: e=l*cos β*tan α, where e is an elevation of the location, where l is a depth of the location in relation to the camera, where β is a tilt angle of the camera, where α is a complementary angle to an angle between the horizon and a surface normal at the location for which the elevation is to be ascertained," which further defines the abstract idea identified above. The claim does not recite any additional elements that integrate the judicial exception into a practical application of that exception or amount to significantly more than the judicial exception.

Claim 4 recites: "wherein a current position of the land vehicle is ascertained based on the image data, wherein a data set is ascertained which includes the current position of the land vehicle and the ascertained elevation...," which further defines the abstract idea identified above. The claim does not recite any additional elements that integrate the judicial exception into a practical application of that exception or amount to significantly more than the judicial exception. Claim 4 further recites: "…wherein the data set is sent to the remote map server via the communication network," which is a form of extra-solution activity.
The claim does not recite any additional elements that integrate the judicial exception into a practical application of that exception or amount to significantly more than the judicial exception.

Claim 5 recites additional abstract ideas that may be performed mentally, i.e., "…wherein, based on the image data, a landmark in the surroundings of the land vehicle is recognized, based on which the current position of the land vehicle is ascertained." Claim 5 recites the same additional elements as claim 1; however, as in claim 1, those elements are disclosed at a high level of generality and are no more than generic computing elements performing generic computing activity. Thus, claim 5 does not recite elements that integrate the judicial exception into a practical application of that exception or amount to significantly more than the judicial exception.

Claim 6 recites: "…wherein a plurality of ascertained elevations from a plurality of locations in the surroundings of the land vehicle relative to the reference location are packed into a data block, which is sent to the remote map server via the communication network," which is a form of extra-solution activity. The claim does not recite any additional elements that integrate the judicial exception into a practical application of that exception or amount to significantly more than the judicial exception.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claim(s) 1, 2, 4, and 7-11 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Kubiak et al. (US 2019/0226853 A1, "Kubiak").

Regarding claims 1, 7, 8, and 11, Kubiak discloses methods and systems for generating and using localization reference data and teaches:

A land vehicle, comprising: (The invention is directed towards a road travelling vehicle, i.e., a land vehicle, as shown in Fig. 3 and Fig. 4)

an apparatus configured to assist in a creation of an elevation map, including: (One exemplary technique for collecting the data to build such planning maps is to use mobile mapping systems, an example of which is depicted in FIG. 3. The mobile mapping system 2 comprises a survey vehicle 4, a digital camera 40 and a laser scanner 6 mounted on the roof 8 of the vehicle 4. The survey vehicle 2 further comprises a processor 10, a memory 12 and a transceiver 14. In addition, the survey vehicle 2 comprises an absolute positioning device 2, such as a GNSS receiver, and a relative positioning device 22 including an inertial measurement unit (IMU) and a distance measurement instrument (DMI) – See at least ¶ [0007])

a camera that is configured to detect surroundings of a land vehicle (The mobile mapping system 2 comprises a survey vehicle 4, a digital camera 40 and a laser scanner 6 mounted on the roof 8 of the vehicle 4 – See at least ¶ [0007] and Fig. 3)

and to ascertain image data based on the detection, (In addition, the camera 40 repeatedly captures images of the road surface 32 to provide a plurality of road surface images; the processor 10 adding a timestamp to each image and storing the images in the memory 12 – See at least ¶ [0008])

an ascertaining device configured to ascertain an elevation of a location in the surroundings of the land vehicle relative to a reference location based on the image data, and (As can be seen from FIGS. 9, 10B, 11 and 12, the localization reference data and the sensed environment data preferably are in the form of depth maps, wherein each element (e.g. pixel when the depth map is stored as an image) comprises: a first value indicative of a longitudinal position (along a road); a second value indicative of an elevation (i.e. a height above ground); and a third value indicative of a lateral position (across a road). Each element, e.g. pixel, of the depth map therefore effectively corresponds to a portion of a surface of the environment around the vehicle – See at least ¶ [0287])

a communication device configured to send the ascertained elevation of the location via a communication network to a remote map server, to assist the remote map server in the creation of the elevation map. (In other embodiments, wherein the localization reference data is stored remotely from the vehicle, the sensed environment data can be sent to a server over a wireless connection, e.g. via the mobile telecommunications network. The server, which has access to the localization reference data, would then return any determined offset back to the vehicle – See at least ¶ [0288]; The methods of the present invention are, in preferred embodiments, implemented by a server or similar computing device. In other words, the methods of the present invention are preferably computer implemented methods.
Thus, in embodiments, the system of the present invention comprises a server or similar computing device comprising the means for carrying out the various steps described, and the method steps described herein are carried out by a server – See at least ¶ [0217]; Examiner notes that creating an elevation map is one of the methods.)

Regarding claim 2, Kubiak further teaches: wherein the land vehicle is a motor vehicle or a rail vehicle. (The invention is directed towards a car, i.e., a motor vehicle – See at least ¶ [0175])

Regarding claim 4, Kubiak further teaches: wherein a current position of the land vehicle is ascertained based on the image data, (In some preferred embodiments the data is used in determining a position of a vehicle relative to the digital map. The digital map thus comprises data representative of navigable elements along which the vehicle is travelling. The method may comprise obtaining the localization reference data associated with the digital map for a deemed current position of the vehicle along a navigable element of the navigable network; determining real time scan data by scanning the environment around the vehicle using at least one sensor, wherein the real time scan data comprises at least one depth map indicative of an environment around the vehicle, each pixel of the at least one depth map being associated with a position in the reference plane associated with the navigable element, and the pixel including a depth channel representing the distance to a surface of an object in the environment along the predetermined direction from the associated position of the pixel in the reference plane as determined using the at least one sensor; calculating a correlation between the localization reference data and the real time scan data to determine an alignment offset between the depth maps; and using the determined alignment offset to adjust the deemed current position to determine the position of the vehicle relative to the digital map – See at least ¶ [0064]; the sensors include camera image data – See at least ¶ [0098])

wherein a data set is ascertained which includes the current position of the land vehicle and the ascertained elevation, wherein the data set is sent to the remote map server via the communication network. (In other embodiments, wherein the localization reference data is stored remotely from the vehicle, the sensed environment data can be sent to a server over a wireless connection, e.g. via the mobile telecommunications network. The server, which has access to the localization reference data, would then return any determined offset back to the vehicle – See at least ¶ [0288])

Regarding claims 9 and 10, Kubiak discloses methods and systems for generating and using localization reference data and teaches:

A map server configured to create an elevation map, comprising: (The methods of the present invention are, in preferred embodiments, implemented by a server or similar computing device. In other words, the methods of the present invention are preferably computer implemented methods. Thus, in embodiments, the system of the present invention comprises a server or similar computing device comprising the means for carrying out the various steps described, and the method steps described herein are carried out by a server – See at least ¶ [0217])

a communication device configured to receive from a land vehicle an ascertained elevation of a location in surroundings of the land vehicle relative to a reference location via a communication network; and (In other embodiments, wherein the localization reference data is stored remotely from the vehicle, the sensed environment data can be sent to a server over a wireless connection, e.g., via the mobile telecommunications network – See at least ¶ [0288])

a processor device configured to create an elevation map based on the ascertained elevation.
(The server, which has access to the localization reference data, would then return any determined offset back to the vehicle – See at least ¶ [0215])

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.

Claim(s) 3 is rejected under 35 U.S.C. 103 as being unpatentable over Kubiak, as applied to claim 1, and in further view of Cuemath (Angle of Elevation, "Cuemath").

Regarding claim 3, Kubiak does not explicitly teach wherein the elevation of the location is ascertained using the following formula: e=l*cos β*tan α, where e is an elevation of the location, where l is a depth of the location in relation to the camera, where β is a tilt angle of the camera, where α is a complementary angle to an angle between the horizon and a surface normal at the location for which the elevation is to be ascertained.
However, Cuemath discloses trigonometric relationships and teaches: wherein the elevation of the location is ascertained using the following formula: e=l*cos β*tan α, where e is an elevation of the location, where l is a depth of the location in relation to the camera, where β is a tilt angle of the camera, where α is a complementary angle to an angle between the horizon and a surface normal at the location for which the elevation is to be ascertained. (The l*cos β portion of the equation states the relationship cos θ = x/z. Rearranging gives z*cos θ = x, or x = z*cos θ. Here x is the elevation, z is the depth of the location relative to the observer camera, and θ is the angle of the observer, i.e., the tilt angle of the camera. This assumes a right triangle; because objects may not form right triangles, further calculation is needed to determine a height at the opposite angle, i.e., the complementary angle. To do this, the relationship tan θ = y/x is used (where y is the surface normal at the location and x is the horizon). Combining these relationships to find the correct elevation gives x = z*cos θ*tan θ, the two angles corresponding to β and α in the claimed formula. This derived formula is equivalent to the formula presented by the applicant, and the derivation is similar to that performed on pg. 17 of the instant application. Please reference the image below.)

[Image: media_image1.png – right-triangle diagram illustrating the trigonometric relationships]

Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the instant application to have modified the methods and systems for generating and using localization reference data of Kubiak to provide for the trigonometric relationships, as taught in Cuemath, to determine elevations using fundamental trigonometric relationships.

Claim(s) 5 is rejected under 35 U.S.C. 103 as being unpatentable over Kubiak, as applied to claim 1, and in further view of Homann et al. (US 2022/0236073, "Homann").
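As an editorial aside on the claim 3 formula discussed above (e = l*cos β*tan α), the computation is easy to sanity-check numerically. The depth and angle values below are hypothetical, chosen only to illustrate the arithmetic; they do not come from the application or the cited art:

```python
import math

def elevation(l: float, beta_deg: float, alpha_deg: float) -> float:
    """Claim 3 formula: e = l * cos(beta) * tan(alpha).

    l         -- depth of the location relative to the camera
    beta_deg  -- tilt angle of the camera, in degrees
    alpha_deg -- complementary angle per the claim, in degrees
    """
    beta = math.radians(beta_deg)
    alpha = math.radians(alpha_deg)
    return l * math.cos(beta) * math.tan(alpha)

# Hypothetical values: 10 m depth, 5 degree camera tilt, 30 degree alpha.
e = elevation(10.0, 5.0, 30.0)
print(f"elevation: {e:.2f} m")  # about 5.75 m
```

Note how a zero alpha yields zero elevation regardless of depth, consistent with the formula's tan α factor.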
Regarding claim 5, Kubiak does not explicitly teach wherein, based on the image data, a landmark in the surroundings of the land vehicle is recognized, based on which the current position of the land vehicle is ascertained. However, Homann discloses a method for creating a universally useable feature map and teaches: wherein, based on the image data, (With the aid of the method according to the present invention, it is possible, in particular, to ascertain and extract static features of surroundings. Such features may, for example, be roadway markings, geometric shapes of buildings, curbs, roads, arrangement and position of traffic lights, guide posts, roadway boundaries, buildings, containers and the like. Such features may be detected by different sensors and may be used for a localization. For example, extracted features from measured data of a LIDAR sensor may also be detected by camera sensors and compared with one another for the purpose of localization – See at least ¶ [0019]) a landmark in the surroundings of the land vehicle is recognized, based on which the current position of the land vehicle is ascertained. (According to one further specific embodiment of the present invention, the extracted features are stored as universally ascertainable features in the feature map. According to one advantageous embodiment, the features are extracted and stored as geometric shapes, lines, points and/or point clouds and the like. Thus, objects, markings and characteristic or distinctive shapes may be extracted from the measured data of the surroundings and used for carrying out localizations. In this way, a plurality of static features may, in particular, also be ascertained in dynamic surroundings and may be used for precisely ascertaining a position – See at least ¶ [0026])

In summary, Kubiak discloses identifying, via a camera system, road markings and other environmental objects. Kubiak does not explicitly disclose recognizing a landmark.
However, Homann discloses a method for creating a universally useable feature map and teaches extracting feature objects with distinctive markings, characteristics, or shapes to create a map which is used to determine the position of the vehicle relative to the objects. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the instant application to have modified the methods and systems for generating and using localization reference data of Kubiak to provide for the method for creating a universally useable feature map, as taught in Homann, to reduce the volume of data required when providing the feature map to vehicles or robots. (At Homann ¶ [0021])

Regarding claim 6, Kubiak further teaches: wherein a plurality of ascertained elevations from a plurality of locations in the surroundings of the land vehicle relative to the reference location are packed into a data block, [] (The localization reference data may be stored in a compressed format. The localization reference data may have a size that corresponds to 30KB/km or less, i.e., a data block – See at least ¶ [0209] Examiner notes that the localization reference data includes depth maps, i.e., elevations from a plurality of locations in the surroundings of the land vehicle – See at least ¶ [0015]) Kubiak does not explicitly teach that the data blocks are sent to the server. Instead, Kubiak teaches communication between a vehicle and a server, with the server sending the data blocks to the vehicle. However, Wheeler discloses high definition map updates based on sensor data collected by autonomous vehicles and teaches: wherein [] the surroundings of the land vehicle relative to the reference location are packed into a data [] (A vehicle 150 includes vehicle sensors 105, vehicle controls 130, and a vehicle computing system 120.
The vehicle sensors 105 allow the vehicle 150 to detect the surroundings of the vehicle as well as information describing the current state of the vehicle, for example, information describing the location and motion parameters of the vehicle. The vehicle sensors 105 comprise a camera, a light detection and ranging sensor (LIDAR), a global positioning system (GPS) navigation system, an inertial measurement unit (IMU), and others. The vehicle has one or more cameras that capture images of the surroundings of the vehicle. A LIDAR surveys the surroundings of the vehicle by measuring distance to a target by illuminating that target with laser light pulses, and measuring the reflected pulses. The GPS navigation system determines the position of the vehicle based on signals from satellites. An IMU is an electronic device that measures and reports motion data of the vehicle such as velocity, acceleration, direction of movement, speed, angular rate, and so on using a combination of accelerometers and gyroscopes or other measuring instruments – See at least ¶ [0036]) which is sent to the remote map server via the communication network.
(The online HD map system 110 receives sensor data captured by sensors of the vehicles, and combines the data received from the vehicles 150 to generate and maintain HD maps… In an embodiment, the online HD map system 110 is implemented as a distributed computing system, for example, a cloud based service that allows clients such as vehicle computing systems 120 to make requests for information and services – See at least ¶ [0030]) Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the instant application to have modified the methods and systems for generating and using localization reference data of Kubiak to provide for the high definition map updates based on sensor data collected by autonomous vehicles, as taught in Wheeler, to provide the right data that is sufficiently accurate and up-to-date for safe navigation. (At Wheeler ¶ [0005])

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHASE L COOLEY, whose telephone number is (303) 297-4355. The examiner can normally be reached Monday-Thursday, 7-5 MT. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Aniss Chad, can be reached at 571-270-3832. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/C.L.C./
Examiner, Art Unit 3662

/ANISS CHAD/
Supervisory Patent Examiner, Art Unit 3662

Prosecution Timeline

Sep 27, 2024
Application Filed
Dec 26, 2025
Non-Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12592154
CONTROL DEVICE, MONITORING SYSTEM, CONTROL METHOD, AND NON-TRANSITORY COMPUTER-READABLE RECORDING MEDIUM
2y 5m to grant Granted Mar 31, 2026
Patent 12570125
TRIP INFORMATION CONTROL SCHEME
2y 5m to grant Granted Mar 10, 2026
Patent 12545274
PEER-TO-PEER VEHICULAR PROVISION OF ARTIFICIALLY INTELLIGENT TRAFFIC ANALYSIS
2y 5m to grant Granted Feb 10, 2026
Patent 12545302
SYSTEM, METHOD AND DEVICES FOR AUTOMATING INSPECTION OF BRAKE SYSTEM ON A RAILWAY VEHICLE OR TRAIN
2y 5m to grant Granted Feb 10, 2026
Patent 12539858
APPARATUS AND METHOD FOR DETERMINING CUT-IN OF VEHICLE
2y 5m to grant Granted Feb 03, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
67%
Grant Probability
88%
With Interview (+20.4%)
3y 1m
Median Time to Grant
Low
PTA Risk
Based on 173 resolved cases by this examiner. Grant probability derived from career allow rate.
