Prosecution Insights
Last updated: April 19, 2026
Application No. 18/310,927

SENSOR-BASED MAP CORRECTION
Non-Final OA §103

Filed: May 02, 2023
Examiner: KUNTZ, JEWEL A
Art Unit: 3666
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Qualcomm Incorporated
OA Round: 3 (Non-Final)

Grant Probability: 72% (Favorable); 80% with interview
Expected OA Rounds: 3-4
Expected Time to Grant: ~3 years
Examiner Intelligence

Career Allow Rate: 72% (above average): 49 granted of 68 resolved, +20.1% vs TC average
Interview Lift: +7.9% (moderate): 80% allowance with an interview vs 72% without, among resolved cases with an interview
Typical Timeline: ~3 years average prosecution; 35 applications currently pending
Career History: 103 total applications across all art units

Statute-Specific Performance

§101: 29.0% (-11.0% vs TC avg)
§103: 52.0% (+12.0% vs TC avg)
§102: 11.8% (-28.2% vs TC avg)
§112: 6.6% (-33.4% vs TC avg)

Tech Center averages are estimates; based on career data from 68 resolved cases.
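The headline examiner figures above can be reproduced from the raw counts shown on the dashboard. A minimal sketch follows; it assumes (the dashboard does not state its methodology) that the interview lift is the simple difference between the with-interview allowance rate and the career allowance rate:

```python
# Reproducing the dashboard's headline examiner statistics from its raw counts.
# Assumption (not stated by the dashboard): interview lift = with-interview
# allowance rate minus the career allowance rate.
granted, resolved = 49, 68          # "49 granted of 68 resolved"
with_interview_rate = 0.80          # "80% with interview"

career_allow_rate = granted / resolved
interview_lift = with_interview_rate - career_allow_rate

print(f"Career allow rate: {career_allow_rate:.1%}")   # Career allow rate: 72.1%
print(f"Interview lift: +{interview_lift:.1%}")        # Interview lift: +7.9%
```

Note the dashboard rounds the career rate to 72% but reports the lift against the unrounded rate (hence +7.9% rather than +8.0%).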

Office Action (§103)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 01/20/2026 has been entered.

Status of the Claims

Claims 1, 3-8, 10, 11, 13-16, 18-23, 25, and 28-37 are currently pending and have been examined. Applicant amended claims 1, 13, 15, 16, 28, 30, and 31.

Response to Arguments/Amendments

The amendment filed June 26, 2025 has been entered. Claims 1, 3-8, 10, 11, 13-16, 18-23, 25, and 28-37 are currently pending in the application. Applicant’s amendments to the claims have overcome the 35 U.S.C. 101 rejection set forth in the Final Rejection mailed October 27, 2025. Applicant’s arguments with respect to claims 1, 3-8, 10, 11, 13-16, 18-23, 25, and 28-37 under 35 U.S.C. 103 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1, 4, 7, 10, 11, 13-16, 19, 22, 25, 28-31, 33, 34, and 36 are rejected under 35 U.S.C. 103 as being unpatentable over Ganjineh (US 2020/0098135 A1) in view of Wheeler (US 2018/0188045 A1).
Regarding Claim 1, Ganjineh teaches an apparatus for processing one or more maps, comprising:

at least one memory (See at least paragraph [0133], “The system may further comprise data storage means, such as computer memory, for storing, for example, the at least one repository including the instructive and informative data. Any of the methods in accordance with the present invention may be implemented at least partially using software, e.g. computer programs. The present invention thus also extends to a computer program product comprising computer readable instructions executable to perform, or to cause a system and/or server to perform a method according to any of the aspects or embodiments of the invention. Thus, the present invention extends to a, preferably non-transitory, computer program product comprising computer readable instructions executable when run on a system in accordance with any of the embodiments of the invention to cause a set of one or processors of the system to perform the steps of any of the aspects or embodiments of the method described herein.”); and

at least one processor coupled to the at least one memory, the at least one processor configured to:

obtain a first map of an environment in which the apparatus is located (See at least paragraph [0133], quoted above, and paragraph [0159], “A reference map section 16 is extracted from the previously compiled internal map repository 18.”);

obtain sensor data from one or more sensors (See at least paragraph [0049], “In embodiments, one or more stereo cameras may be used to obtain (at least some of) the plurality of images. However, the use of a stereo camera is not necessary, and in some embodiments the images may be obtained using (only) one or more monocular or single cameras. In some preferred embodiments, the one or more cameras may comprise one or more stereo cameras and one or more single (monocular) cameras. For instance, in embodiments, as explained below, a plurality of stereo images obtained using stereo cameras may advantageously be used for the purposes of visual odometry. However, for the purposes of identifying and classifying objects in the images, it may be preferred to use images obtained from single (monocular) cameras” and paragraph [0050], “The image data may be supplemented by data from various other sensors, as desired. For instance, positional data, such as GNSS data, may be used to provide a coarse localisation of the vehicle and a timestamp for each of the images.”);

generate a second map of the environment based on the sensor data (See at least paragraphs [0049] and [0050], quoted above, and paragraph [0051], “Because the images in the sequence of images are generally obtained at different locations (e.g. reflecting the movement of the vehicle, and hence the associated camera or cameras, through the road network), in order to process the images together to generate a consistent local map representation of the area within which the vehicle is travelling it is necessary to know (or at least be able to determine) the camera location associated with each of the images. That is, each of the images represents a two-dimensional view of a certain part of the area. In order to generate a consistent view of the entire area, it is thus necessary to aggregate the different views from the different images together. This can be done based on knowledge of the locations at which the images were recorded. For instance, it will be appreciated that an object generally appears in any given image as a set of two-dimensional points. When a sequence of images is recorded, the same object may thus appear in multiple images but viewed from different perspectives. This is what allows the object to be localised, e.g. using triangulation. However, in order to generate a consistent local map representation, so that the object can be mapped from the images into a desired three-dimensional coordinate frame, e.g. to show the relative position and orientation of the object within the road network, the position and orientation (together, the “pose”) of the camera used to obtain the different images must be taken into account. That is, in order to generate a consistent local map representation from a sequence of images recorded at different positions, it is necessary to know, or determine, the camera locations for the different images.”); and

compare first one or more elements shown in the first map and second one or more elements shown in the second map, each respective element of the first one or more elements corresponding to a respective element of the second one or more elements (See at least paragraph [0120], “For instance, the local map representation may be aligned with a corresponding reference map section. Any features that are included in the local map representation, such as the landmark observations and road/lane geometries described above, may in general be used to perform this matching… preferably, the comparison comprises matching and/or aligning the positions of one or more features, e.g. landmarks and/or lane markings, of the local map representation with the positions of the corresponding features in the reference map section” and paragraph [0163], “The local map representation 14 of the scenery is then matched and aligned against the reference map section 16 (given that the reference map provides sufficient coverage) to determine various correspondences 22 between the local map representation 14 and the reference map section 16.”).
Ganjineh does not explicitly disclose, however, Wheeler, in the same field of endeavor, teaches:

determine at least one difference between the first one or more elements shown in the first map and the second one or more elements shown in the second map is greater than a threshold difference based on comparing the first one or more elements shown in the first map with the second one or more elements shown in the second map (See at least paragraph [0053], “The map discrepancy module 290 works with the map update API 285 to determine map discrepancies and communicate map discrepancy information to the online HD map system 110. Determining map discrepancies involves comparing sensor data 230 of a particular location to HD map data for that particular location” and paragraph [0122], “The vehicle 150 transmits 1120 a discrepancy to the online HD map system 110 if it determines that the discrepancy is a significant discrepancy. The vehicle may send raw sensor data associated with the discrepancy to the online HD map system 110 along with the discrepancy. The vehicle 150 stores 1122 the updated occupancy map locally in the local HD map store 275. In some embodiments, the vehicle 150 transmits a discrepancy immediately if the associated significance value is greater than a threshold.” The system compares sensor observations of the environment with HD map data representing the same location in order to determine map discrepancies. The system therefore compares elements represented in the HD map, corresponding to the claimed first map, with elements represented in a locally generated occupancy map derived from sensor observations, corresponding to the claimed second map. The system evaluates the discrepancy and determines whether an associated significance value exceeds a threshold, thereby determining that a difference between the map representations exceeds the threshold.);

determine to use the second map for at least one navigation function based on determining the at least one difference is greater than the threshold difference (See at least paragraph [0122], quoted above. The system determines that when the discrepancy associated with the observed environment exceeds the threshold significance value, the locally generated occupancy map reflecting the sensed environment is stored locally in the vehicle system. The system therefore relies on the locally generated occupancy map derived from sensor observations, corresponding to the second map, when the discrepancy exceeds the threshold.); and

perform the at least one navigation function using the second map, wherein the at least one navigation function controls a movement of a vehicle (See at least paragraph [0037], “The vehicle controls 130 control the physical movement of the vehicle, for example, acceleration, direction change, starting, stopping, and so on. The vehicle controls 130 include the machinery for controlling the accelerator, brakes, steering wheel, and so on. The vehicle computing system 120 continuously provides control signals to the vehicle controls 130, thereby causing an autonomous vehicle to drive along a selected route”, paragraph [0040], “FIG. 2 shows the system architecture of a vehicle computing system, according to an embodiment. The vehicle computing system 120 comprises a perception module 210, a prediction module 215, a planning module 220, a control module 225, a local HD map store 275, an HD map system interface 280, a map discrepancy module 290, and an HD map application programming interface (API) 205. The various modules of the vehicle computing system 120 process various type of data including sensor data 230, a behavior model 235, routes 240, and physical constraints 245. In other embodiments, the vehicle computing system 120 may have more or fewer modules. Functionality described as being implemented by a particular module may be implemented by other modules”, paragraph [0043], “The planning module 220 receives the information describing the surroundings of the vehicle from the prediction module 215, the route 240 that determines the destination of the vehicle, and the path that the vehicle should take to get to the destination. The planning module 220 uses the information from the prediction module 215 and the route 240 to plan a sequence of actions that the vehicle needs to take within a short time interval, for example, within the next few seconds. In an embodiment, the planning module 220 specifies the sequence of actions as one or more points representing nearby locations that the vehicle needs to drive through next. The planning module 220 provides the details of the plan comprising the sequence of actions to be taken by the vehicle to the control module 225. The plan may determine the subsequent action of the vehicle, for example, whether the vehicle performs a lane change, a turn, acceleration by increasing the speed or slowing down, and so on”, paragraph [0044], “The control module 225 determines the control signals for sending to the controls 130 of the vehicle based on the plan received from the planning module 220. For example, if the vehicle is currently at point A and the plan specifies that the vehicle should next go to a nearby point B, the control module 225 determines the control signals for the controls 130 that would cause the vehicle to go from point A to point B”, and paragraph [0122], quoted above. The system includes a vehicle computing system having a planning module and a control module that process sensor and map data. The system plans navigation actions such as lane changes, turns, acceleration, or slowing down and determines control signals transmitted to vehicle controls including steering, throttle, and braking systems to cause the vehicle to move according to the planned navigation actions.).
Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to combine the invention of Ganjineh with the teachings of Wheeler such that the map system of Ganjineh is further configured to determine at least one difference between the first one or more elements shown in the first map and the second one or more elements shown in the second map is greater than a threshold difference based on comparing the first one or more elements shown in the first map with the second one or more elements shown in the second map, determine to use the second map for at least one navigation function based on determining the at least one difference is greater than the threshold difference, and perform the at least one navigation function using the second map, wherein the at least one navigation function controls a movement of a vehicle, as taught by Wheeler (See paragraphs [0037], [0040], [0043], [0044], [0053], [0122].), with a reasonable expectation of success. The motivation for doing so would be increased navigation safety by providing low latency, high accuracy, data freshness, and storage efficiency, as taught by Wheeler (See paragraph [0029].).

With respect to claim 16, please see the rejection above with respect to claim 1, which is commensurate in scope to claim 16, with claim 1 being drawn to a mapping system and claim 16 being drawn to a corresponding method.

With respect to claim 31, please see the rejection above with respect to claim 1, which is commensurate in scope to claim 31, with claim 1 being drawn to a mapping system and claim 31 being drawn to a corresponding non-transitory computer-readable medium.

Regarding Claim 4, Ganjineh and Wheeler teach the apparatus of claim 1, as set forth in the obviousness rejection above.
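The disputed claim-1 limitations amount to a selection rule: threshold the element-wise difference between the reference map and the sensor-derived map, and switch to the sensor-derived map when the threshold is exceeded. The sketch below is purely illustrative of that claim language; the data model, names, and threshold value are assumptions, not taken from the application, Ganjineh, or Wheeler.

```python
import math

# Illustrative sketch only: a hypothetical data model for the claim-1 logic
# (compare corresponding map elements, threshold the difference, pick a map).
# Maps are dicts from element IDs to (x, y) positions in meters.

def element_difference(a, b):
    """Euclidean distance between corresponding (x, y) element positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def select_map(first_map, second_map, threshold):
    """Return the sensor-derived second map if any corresponding element
    differs by more than the threshold; otherwise keep the first map."""
    for key, pos in first_map.items():
        if key in second_map and element_difference(pos, second_map[key]) > threshold:
            return second_map
    return first_map

reference = {"lane_marking_1": (0.0, 0.0), "sign_4": (10.0, 5.0)}  # first map
observed  = {"lane_marking_1": (0.0, 0.1), "sign_4": (10.0, 7.5)}  # second map
chosen = select_map(reference, observed, threshold=1.0)
print(chosen is observed)  # True: sign_4 differs by 2.5 m, exceeding 1.0 m
```

With a looser threshold (say 5.0 m) no element exceeds it and the reference map is retained, mirroring the claim's "based on determining the at least one difference is greater than the threshold difference" condition.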
Ganjineh teaches wherein the at least one processor is configured to: determine at least one correction for the first map based on determining the at least one difference is greater than the threshold difference (See at least paragraph [0120], “For instance, the local map representation may be aligned with a corresponding reference map section. Any features that are included in the local map representation, such as the landmark observations and road/lane geometries described above, may in general be used to perform this matching… preferably, the comparison comprises matching and/or aligning the positions of one or more features, e.g. landmarks and/or lane markings, of the local map representation with the positions of the corresponding features in the reference map section”, paragraph [0121], “the matching may be used for the purposes of map generation and/or updating a map. For instance, when the matching indicates that the reference map section is out of date or contains one or more errors, e.g. where the local map representation contains one or more features that are not present in the reference map section, or where the local map representation shows that one or more features have changed, or are no longer present in the environment of the road network, the reference map may be updated accordingly”, and paragraph [0163], “The local map representation 14 of the scenery is then matched and aligned against the reference map section 16 (given that the reference map provides sufficient coverage) to determine various correspondences 22 between the local map representation 14 and the reference map section 16.”).

With respect to claim 19, please see the rejection above with respect to claim 4, which is commensurate in scope to claim 19, with claim 4 being drawn to a mapping system and claim 19 being drawn to a corresponding method.

With respect to claim 33, please see the rejection above with respect to claim 4, which is commensurate in scope to claim 33, with claim 4 being drawn to a mapping system and claim 33 being drawn to a corresponding non-transitory computer-readable medium.

Regarding Claim 7, Ganjineh and Wheeler teach the apparatus of claim 4, as set forth in the obviousness rejection above. Ganjineh teaches wherein the at least one processor is configured to: apply the at least one correction to the first map (See at least paragraphs [0120], [0121], and [0163], cited in the rejection of claim 4 above.).
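Claims 4 and 7 add a two-step update: derive a correction for the reference map from the sensor-derived map, then apply it. The following is a hypothetical sketch of that idea under the same assumed dict-based data model as above; the names are illustrative assumptions, not drawn from Ganjineh or Wheeler.

```python
# Hypothetical sketch of the claim-4/claim-7 idea: determine corrections for
# the reference (first) map from the sensor-derived (second) map, then apply
# them. Maps are dicts from element IDs to (x, y) positions.

def determine_corrections(first_map, second_map):
    """Corrections = elements whose observed position differs from the reference."""
    return {k: v for k, v in second_map.items() if first_map.get(k) != v}

def apply_corrections(first_map, corrections):
    """Return an updated copy of the reference map with the corrections applied."""
    updated = dict(first_map)
    updated.update(corrections)
    return updated

reference = {"sign_4": (10.0, 5.0), "lane_1": (0.0, 0.0)}
observed  = {"sign_4": (10.0, 7.5), "lane_1": (0.0, 0.0)}

corrections = determine_corrections(reference, observed)
print(corrections)                                        # {'sign_4': (10.0, 7.5)}
print(apply_corrections(reference, corrections)["sign_4"])  # (10.0, 7.5)
```

In practice a correction would carry richer data (geometry deltas, confidence, timestamps), but the separation into "determine" and "apply" mirrors the claim structure.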
With respect to claim 22, please see the rejection above with respect to claim 7, which is commensurate in scope to claim 22, with claim 7 being drawn to a mapping system and claim 22 being drawn to a corresponding method.

With respect to claim 36, please see the rejection above with respect to claim 7, which is commensurate in scope to claim 36, with claim 7 being drawn to a mapping system and claim 36 being drawn to a corresponding non-transitory computer-readable medium.

Regarding Claim 10, Ganjineh and Wheeler teach the apparatus of claim 1, as set forth in the obviousness rejection above. Ganjineh teaches wherein the first map is a high definition (HD) map (See at least paragraph [0045], “The road network is generally a network comprising a plurality of interconnected roads that are navigable by a vehicle. The road network may generally be represented by a digital, or electronic, map (or mathematical graph)…This type of electronic map may be referred to as a “HD” map (compared to a conventional “SD” map containing the road centrelines but not the lane centrelines). The additional information contained in the HD map, and at least the lane markings, is generally required for the purposes of autonomous driving”, paragraph [0122], “The road network is generally a network comprising a plurality of interconnected roads that are navigable by a vehicle. The road network may generally be represented by a digital, or electronic, map (or mathematical graph)”, and paragraph [0159], “A reference map section 16 is extracted from the previously compiled internal map repository 18.”).

With respect to claim 25, please see the rejection above with respect to claim 10, which is commensurate in scope to claim 25, with claim 10 being drawn to a mapping system and claim 25 being drawn to a corresponding method.

Regarding Claim 11, Ganjineh and Wheeler teach the apparatus of claim 1, as set forth in the obviousness rejection above. Ganjineh teaches further comprising the one or more sensors (See at least paragraph [0179], “FIG. 2 shows an autonomous vehicle system 200 equipped with a camera sensor input, and a high-bandwidth wireless internet connection to a cloud-computing environment 206. Thus, the autonomous vehicle is equipped with a monocular or stereo camera 202; a high-end on-board processing unit; and a high-bandwidth high-latency mobile data connection, or W-LAN 204. The system receives live recorded images from the on-board cameras 202, as well as coarse GPS coordinates from the on-board odometry. The localisation result 208 is handed over to autonomous driving logic. The map building result resides within a cloud-based repository 206.”).

Regarding Claim 13, Ganjineh and Wheeler teach the apparatus of claim 1, as set forth in the obviousness rejection above. Ganjineh does not explicitly disclose, however, Wheeler, in the same field of endeavor, teaches wherein the apparatus is part of the vehicle (See at least paragraph [0040], “FIG. 2 shows the system architecture of a vehicle computing system, according to an embodiment. The vehicle computing system 120 comprises a perception module 210, a prediction module 215, a planning module 220, a control module 225, a local HD map store 275, an HD map system interface 280, a map discrepancy module 290, and an HD map application programming interface (API) 205. The various modules of the vehicle computing system 120 process various type of data including sensor data 230, a behavior model 235, routes 240, and physical constraints 245. In other embodiments, the vehicle computing system 120 may have more or fewer modules. Functionality described as being implemented by a particular module may be implemented by other modules.”).
With respect to claim 28, please see the rejection above with respect to claim 13, which is commensurate in scope to claim 28, with claim 13 being drawn to a mapping system and claim 28 being drawn to a corresponding method. Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to combine the invention of Ganjineh with the teachings of Wheeler such that the map system of Ganjineh is further configured to determine at least one difference between the first one or more elements shown in the first map and the second one or more elements shown in the second map is greater than a threshold difference based on comparing the first one or more elements shown in the first map with the second one or more elements shown in the second map, determine to use the second map for at least one navigation function based on determining the at least one difference is greater than the threshold difference, perform the at least one navigation function using the second map, wherein the at least one navigation function controls a movement of a vehicle, and wherein the apparatus is part of the vehicle, as taught by Wheeler (See paragraphs [0037], [0040], [0043], [0044], [0053], [0122].), with a reasonable expectation of success. The motivation for doing so would be increased navigation safety by providing low latency, high accuracy, data freshness, and store efficiency, as taught by Wheeler (See paragraph [0029].). Regarding Claim 14, Ganjineh and Wheeler teach The apparatus of claim 13, as set forth in the obviousness rejection above. Ganjineh does not explicitly disclose, however, Wheeler, in the same field of endeavor, teaches wherein, to perform the at least one navigation function, the at least one processor is configured to: cause the vehicle to perform the at least one navigation function using the second map (See at least paragraph [0040], “FIG. 2 shows the system architecture of a vehicle computing system, according to an embodiment. 
The vehicle computing system 120 comprises a perception module 210, a prediction module 215, a planning module 220, a control module 225, a local HD map store 275, an HD map system interface 280, a map discrepancy module 290, and an HD map application programming interface (API) 205. The various modules of the vehicle computing system 120 process various type of data including sensor data 230, a behavior model 235, routes 240, and physical constraints 245. In other embodiments, the vehicle computing system 120 may have more or fewer modules. Functionality described as being implemented by a particular module may be implemented by other modules”, paragraph [0043], “The planning module 220 receives the information describing the surroundings of the vehicle from the prediction module 215, the route 240 that determines the destination of the vehicle, and the path that the vehicle should take to get to the destination. The planning module 220 uses the information from the prediction module 215 and the route 240 to plan a sequence of actions that the vehicle needs to take within a short time interval, for example, within the next few seconds. In an embodiment, the planning module 220 specifies the sequence of actions as one or more points representing nearby locations that the vehicle needs to drive through next. The planning module 220 provides the details of the plan comprising the sequence of actions to be taken by the vehicle to the control module 225. The plan may determine the subsequent action of the vehicle, for example, whether the vehicle performs a lane change, a turn, acceleration by increasing the speed or slowing down, and so on”, and paragraph [0044], “The control module 225 determines the control signals for sending to the controls 130 of the vehicle based on the plan received from the planning module 220. 
For example, if the vehicle is currently at point A and the plan specifies that the vehicle should next go to a nearby point B, the control module 225 determines the control signals for the controls 130 that would cause the vehicle to go from point A to point B.” The system includes a planning module that determines a sequence of navigation actions for the vehicle such as lane changes, turns, acceleration, or slowing down. The system further includes a control module that determines control signals transmitted to the vehicle controls to cause the vehicle to perform the planned actions.). Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to combine the invention of Ganjineh with the teachings of Wheeler such that the map system of Ganjineh is further configured to determine at least one difference between the first one or more elements shown in the first map and the second one or more elements shown in the second map is greater than a threshold difference based on comparing the first one or more elements shown in the first map with the second one or more elements shown in the second map, determine to use the second map for at least one navigation function based on determining the at least one difference is greater than the threshold difference, perform the at least one navigation function using the second map, wherein the at least one navigation function controls a movement of a vehicle, and wherein, to perform the at least one navigation function, the at least one processor is configured to: cause the vehicle to perform the at least one navigation function using the second map, as taught by Wheeler (See paragraphs [0037], [0040], [0043], [0044], [0053], [0122].), with a reasonable expectation of success. The motivation for doing so would be increased navigation safety by providing low latency, high accuracy, data freshness, and store efficiency, as taught by Wheeler (See paragraph [0029].). 
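For illustration only (this code is not part of the Office Action record or of either reference), the threshold comparison that the rejection attributes to the Ganjineh–Wheeler combination — compare corresponding elements of two maps and use the second map for the navigation function when at least one difference exceeds a threshold — can be sketched as follows. All identifiers, the 2-D position representation, and the Euclidean difference metric are hypothetical choices:

```python
# Hypothetical sketch of the claimed map-comparison and selection step.
# Maps are modeled as {element_id: (x, y)}; the references do not specify
# a data structure or distance metric at this level of detail.

def max_element_difference(first_map, second_map):
    """Compare corresponding elements of the two maps and return the
    largest positional difference found between them."""
    diffs = []
    for element_id, (x1, y1) in first_map.items():
        if element_id in second_map:
            x2, y2 = second_map[element_id]
            diffs.append(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5)
    return max(diffs) if diffs else 0.0

def select_map(first_map, second_map, threshold):
    """Use the second (sensor-derived) map for the navigation function
    when at least one element differs by more than the threshold;
    otherwise keep using the first (pre-loaded) map."""
    if max_element_difference(first_map, second_map) > threshold:
        return second_map
    return first_map

# Example: the lane-marking element has shifted by ~1.41 m; with a
# 1.0 m threshold the second map is selected.
first = {"lane_1": (0.0, 0.0), "sign_7": (5.0, 5.0)}
second = {"lane_1": (1.0, 1.0), "sign_7": (5.0, 5.0)}
chosen = select_map(first, second, threshold=1.0)
```

The selected map would then be passed to the planning/control pipeline Wheeler describes (planning module determining actions, control module issuing control signals).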
With respect to claim 29, please see the rejection above with respect to claim 14, which is commensurate in scope to claim 29, with claim 14 being drawn to a mapping system and claim 29 being drawn to a corresponding method. Regarding Claim 15, Ganjineh and Wheeler teach The apparatus of claim 1, as set forth in the obviousness rejection above. Ganjineh does not explicitly disclose, however, Wheeler, in the same field of endeavor, teaches wherein the at least one navigation function includes at least one of an Advanced Driver Assistance Systems (ADAS) function, an autonomous driving decision, a lane change function (See at least paragraph [0043], “The planning module 220 receives the information describing the surroundings of the vehicle from the prediction module 215, the route 240 that determines the destination of the vehicle, and the path that the vehicle should take to get to the destination. The planning module 220 uses the information from the prediction module 215 and the route 240 to plan a sequence of actions that the vehicle needs to take within a short time interval, for example, within the next few seconds. In an embodiment, the planning module 220 specifies the sequence of actions as one or more points representing nearby locations that the vehicle needs to drive through next. The planning module 220 provides the details of the plan comprising the sequence of actions to be taken by the vehicle to the control module 225. The plan may determine the subsequent action of the vehicle, for example, whether the vehicle performs a lane change, a turn, acceleration by increasing the speed or slowing down, and so on.”), a vehicle overtake function, a stop function, a speed reduction function, or a velocity increase function. 
Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to combine the invention of Ganjineh with the teachings of Wheeler such that the map system of Ganjineh is further configured to determine at least one difference between the first one or more elements shown in the first map and the second one or more elements shown in the second map is greater than a threshold difference based on comparing the first one or more elements shown in the first map with the second one or more elements shown in the second map, determine to use the second map for at least one navigation function based on determining the at least one difference is greater than the threshold difference, perform the at least one navigation function using the second map, wherein the at least one navigation function controls a movement of a vehicle, and wherein, to perform the at least one navigation function, the at least one processor is configured to: cause the vehicle to perform the at least one navigation function using the second map, as taught by Wheeler (See paragraphs [0037], [0040], [0043], [0044], [0053], [0122].), with a reasonable expectation of success. The motivation for doing so would be increased navigation safety by providing low latency, high accuracy, data freshness, and store efficiency, as taught by Wheeler (See paragraph [0029].). With respect to claim 30, please see the rejection above with respect to claim 15, which is commensurate in scope to claim 30, with claim 15 being drawn to a mapping system and claim 30 being drawn to a corresponding method. Claim(s) 5, 20, and 34 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ganjineh (US 20200098135 A1) in view of Wheeler (US 20180188045 A1) and Stenneth (US 20220397419 A1). Regarding Claim 5, Ganjineh and Wheeler teach The apparatus of claim 4, as set forth in the obviousness rejection above. 
Ganjineh teaches wherein the at least one processor is configured to: cause (See at least paragraph [0027], “When such errors are identified, the reference map may be updated accordingly. For instance, when such errors are identified, the local map representation may then be provided to a remote server in order to update the reference map, e.g. by updating the, or generating a new, reference map section to be incorporated into the reference map based on the local map representation”, paragraph [0120], “For instance, the local map representation may be aligned with a corresponding reference map section. Any features that are included in the local map representation, such as the landmark observations and road/lane geometries described above, may in general be used to perform this matching… preferably, the comparison comprises matching and/or aligning the positions of one or more features, e.g. landmarks and/or lane markings, of the local map representation with the positions of the corresponding features in the reference map section”, paragraph [0121], “the matching may be used for the purposes of map generation and/or updating a map. For instance, when the matching indicates that the reference map section is out of date or contains one or more errors, e.g. where the local map representation contains one or more features that are not present in the reference map section, or where the local map representation shows that one or more features have changed, or are no longer present in the environment of the road network, the reference map may be updated accordingly”, and paragraph [0163], “The local map representation 14 of the scenery is then matched and aligned against the reference map section 16 (given that the reference map provides sufficient coverage) to determine various correspondences 22 between the local map representation 14 and the reference map section 16.”).
Ganjineh and Wheeler do not explicitly disclose, however, Stenneth, in the same field of endeavor, teaches at least one transceiver (See at least paragraph [0127], “For wireless links, the communications interface 1070 may send and/or receive electrical, acoustic or electromagnetic signals, including infrared and optical signals, that carry information streams, including digital data. For example, in wireless handheld devices (e.g., mobile telephones, cell phones, and so forth), the communications interface 1070 may include a radio band electromagnetic transmitter and receiver called a radio transceiver. In certain embodiments, the communications interface 1070 enables connection to the communication network, as described with reference to FIG. 1A.”). Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to combine the invention of Ganjineh with the teachings of Wheeler and Stenneth such that the map system of Ganjineh is further configured to determine at least one difference between the first one or more elements shown in the first map and the second one or more elements shown in the second map is greater than a threshold difference based on comparing the first one or more elements shown in the first map with the second one or more elements shown in the second map, determine to use the second map for at least one navigation function based on determining the at least one difference is greater than the threshold difference, perform the at least one navigation function using the second map, wherein the at least one navigation function controls a movement of a vehicle, and wherein, to perform the at least one navigation function, the at least one processor is configured to: cause the vehicle to perform the at least one navigation function using the second map, as taught by Wheeler (See paragraphs [0037], [0040], [0043], [0044], [0053], [0122].), and utilize a transceiver for communicating, as taught by Stenneth (See 
paragraph [0127].), with a reasonable expectation of success. The motivation for doing so would be increased navigation safety by providing low latency, high accuracy, data freshness, and store efficiency, as taught by Wheeler (See paragraph [0029].). The motivation for doing so would be increased map accuracy and selection, as taught by Stenneth (See paragraph [0002].). With respect to claim 20, please see the rejection above with respect to claim 5, which is commensurate in scope to claim 20, with claim 5 being drawn to a mapping system and claim 20 being drawn to a corresponding method. With respect to claim 34, please see the rejection above with respect to claim 5, which is commensurate in scope to claim 34, with claim 5 being drawn to a mapping system and claim 34 being drawn to a corresponding non-transitory computer-readable medium. Claim(s) 6, 8, 21, 23, 35, and 37 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ganjineh (US 20200098135 A1) in view of Wheeler (US 20180188045 A1) and LIU (CN 114490675 A). Regarding Claim 6, Ganjineh and Wheeler teach The apparatus of claim 4, as set forth in the obviousness rejection above. Ganjineh and Wheeler do not explicitly disclose, however, LIU, in the same field of endeavor, teaches wherein the apparatus is a first vehicle or is part of the first vehicle, and wherein the at least one processor is configured to: cause at least one transceiver to transmit information associated with the at least one correction to a second vehicle (See at least paragraph [n0035], “communication device is provided, which is the map updating device on the terminal device side (such as the first map updating device on the first terminal device side) or the map updating device on the server side (such as the second map updating device)…it also includes a transceiver, the memory is used to store computer programs or instructions, and the processor is used to call and run the computer program or instructions from the memory. 
When the processor executes the computer program or instructions in the memory, the communication device executes any implementation of any communication method of the first to third aspects above”, paragraph [n0060], “The map updating device may include a map updating device of a positioning system in the vehicle, a map updating device of an intelligent driving system, or any other device with computing capability. For ease of reference, in the embodiment of the present application, the map updating device at the first terminal device end is referred to as the first map updating device, the map updating device at the second terminal device end is referred to as the third map updating device, and the map updating device at the third terminal device end is referred to as the fourth map updating device”, paragraph [n0061], “In the embodiments of the present application, the vehicle can communicate with other objects based on vehicle-to-external wireless communication technology (for example, vehicle to everything (V2X)). For example, the communication between the vehicle and the second map updating device may be implemented based on inter-vehicle wireless communication technology (eg, vehicle to vehicle (V2V))”, and paragraph [n0164], “In the above step 205, the first map updating apparatus may report at least one item of content included in the first location information or the first error information in step 203 to the second map updating apparatus in one signaling.”). 
Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to combine the invention of Ganjineh with the teachings of Wheeler and LIU such that the map system of Ganjineh is further configured to determine at least one difference between the first one or more elements shown in the first map and the second one or more elements shown in the second map is greater than a threshold difference based on comparing the first one or more elements shown in the first map with the second one or more elements shown in the second map, determine to use the second map for at least one navigation function based on determining the at least one difference is greater than the threshold difference, perform the at least one navigation function using the second map, wherein the at least one navigation function controls a movement of a vehicle, and wherein, to perform the at least one navigation function, the at least one processor is configured to: cause the vehicle to perform the at least one navigation function using the second map, as taught by Wheeler (See paragraphs [0037], [0040], [0043], [0044], [0053], [0122].), and to utilize a transceiver to transmit information associated with the correction to a second vehicle, as taught by LIU (See paragraphs [n0061], [n0164].), with a reasonable expectation of success. The motivation for doing so would be increased navigation safety by providing low latency, high accuracy, data freshness, and store efficiency, as taught by Wheeler (See paragraph [0029].). The motivation for doing so would be increased map precision communicated to multiple vehicles, as taught by LIU (See paragraph [n0002].). With respect to claim 21, please see the rejection above with respect to claim 6, which is commensurate in scope to claim 21, with claim 6 being drawn to a mapping system and claim 21 being drawn to a corresponding method. 
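For illustration only, the claim 6 feature mapped to LIU — causing a transceiver to transmit information associated with a correction to a second vehicle — can be sketched as follows. The JSON encoding, the loopback transceiver, and all names are hypothetical stand-ins; LIU describes V2X/V2V signaling but does not specify a payload format at this level:

```python
# Hypothetical sketch of packaging a map-element correction and handing it
# to a transceiver for transmission to a second vehicle (V2V reporting).
import json

def encode_correction(element_id, old_pos, new_pos):
    """Serialize a single map-element correction for transmission."""
    return json.dumps({
        "element_id": element_id,
        "old_pos": old_pos,
        "new_pos": new_pos,
    }).encode("utf-8")

class LoopbackTransceiver:
    """Stand-in transceiver that records transmitted frames instead of
    driving a radio; models the claimed 'at least one transceiver'."""
    def __init__(self):
        self.sent = []

    def transmit(self, payload: bytes, destination: str):
        self.sent.append((destination, payload))

transceiver = LoopbackTransceiver()
frame = encode_correction("lane_1", [0.0, 0.0], [1.0, 1.0])
transceiver.transmit(frame, destination="second_vehicle")
```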
With respect to claim 35, please see the rejection above with respect to claim 6, which is commensurate in scope to claim 35, with claim 6 being drawn to a mapping system and claim 35 being drawn to a corresponding non-transitory computer-readable medium. Regarding Claim 8, Ganjineh and Wheeler teach The apparatus of claim 4, as set forth in the obviousness rejection above. Ganjineh and Wheeler do not explicitly disclose, however, LIU, in the same field of endeavor, teaches wherein the at least one correction includes at least a first correction and a second correction, and wherein the at least one processor is configured to: apply the first correction to a first element of the first map (See at least paragraph [n0226], “the processor 1302 is specifically used to: determine a first weight based on the first error information, the first weight being used to indicate the degree of correction of the current position information of the map element by the first position information; and update the current position information of the map element on the map based on the first position information and the first weight.”); and apply the second correction to a second element of the first map (See at least paragraph [n0227], “updating the map according to the first error information and the first position information, is further used to: obtain second error information of the second vehicle and second position information of the map element collected by the second vehicle, the second error information including error information of the second vehicle positioned by a positioning system of the second vehicle; determine a second weight according to the first error information and the second error information, the second weight being used to indicate a degree of correction of the current position information of the map element by the second position information.”).
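For illustration only, the weighted update LIU describes in paragraphs [n0226]–[n0227] — a weight derived from a reporting vehicle's positioning-error information controlling how strongly a reported position corrects a map element's current position — can be sketched as follows. The inverse-error weighting formula is an assumption; LIU does not fix a particular formula, and all names are hypothetical:

```python
# Hypothetical sketch of a weighted map-element correction.

def weight_from_error(error: float) -> float:
    """Smaller reported positioning error -> larger correction weight,
    modeling LIU's 'degree of correction' indicated by the weight."""
    return 1.0 / (1.0 + error)

def update_position(current, reported, error):
    """Blend the element's current map position toward the reported
    position, scaled by the weight for the reporting vehicle's error."""
    w = weight_from_error(error)
    return tuple(c + w * (r - c) for c, r in zip(current, reported))

# A zero-error report (weight 1.0) adopts the reported position outright;
# a report with error 1.0 (weight 0.5) moves it only halfway.
updated = update_position((0.0, 0.0), (2.0, 0.0), error=1.0)
```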
Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to combine the invention of Ganjineh with the teachings of Wheeler and LIU such that the map system of Ganjineh is further configured to determine at least one difference between the first one or more elements shown in the first map and the second one or more elements shown in the second map is greater than a threshold difference based on comparing the first one or more elements shown in the first map with the second one or more elements shown in the second map, determine to use the second map for at least one navigation function based on determining the at least one difference is greater than the threshold difference, perform the at least one navigation function using the second map, wherein the at least one navigation function controls a movement of a vehicle, and wherein, to perform the at least one navigation function, the at least one processor is configured to: cause the vehicle to perform the at least one navigation function using the second map, as taught by Wheeler (See paragraphs [0037], [0040], [0043], [0044], [0053], [0122].), and to apply the first correction to a first element of the first map and apply the second correction to a second element of the first map, as taught by LIU (See paragraphs [n0226], [n0227].), with a reasonable expectation of success. The motivation for doing so would be increased navigation safety by providing low latency, high accuracy, data freshness, and store efficiency, as taught by Wheeler (See paragraph [0029].). The motivation for doing so would be increased map precision communicated to multiple vehicles, as taught by LIU (See paragraph [n0002].). With respect to claim 23, please see the rejection above with respect to claim 8, which is commensurate in scope to claim 23, with claim 8 being drawn to a mapping system and claim 23 being drawn to a corresponding method.
With respect to claim 37, please see the rejection above with respect to claim 8, which is commensurate in scope to claim 37, with claim 8 being drawn to a mapping system and claim 37 being drawn to a corresponding non-transitory computer-readable medium. Claim(s) 3, 18, and 32 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ganjineh (US 20200098135 A1) in view of Wheeler (US 20180188045 A1) and SHALEV-SHWARTZ (US 20230166729 A1). Regarding Claim 3, Ganjineh and Wheeler teach The apparatus of claim 1, as set forth in the obviousness rejection above. Ganjineh and Wheeler do not explicitly disclose, however, SHALEV-SHWARTZ, in the same field of endeavor, teaches wherein the at least one processor is configured to: generate a third map of the environment based on additional sensor data from the one or more sensors; compare third one or more elements of the first map and fourth one or more elements of the third map, each respective element of the third one or more elements corresponding to a respective element of the fourth one or more elements; determine at least one difference between the third one or more elements of the first map and the fourth one or more elements of the third map is less than the threshold difference; and determine to use the first map for at least one additional navigation function based on the at least one difference between the third one or more elements and the fourth one or more elements being less than the threshold difference (See at least paragraph [0154], “In a three camera system, a first processing device may receive images from both the main camera and the narrow field of view camera, and perform vision processing of the narrow FOV camera to, for example, detect other vehicles, pedestrians, lane marks, traffic signs, traffic lights, and other road objects. 
Further, the first processing device may calculate a disparity of pixels between the images from the main camera and the narrow camera and create a 3D reconstruction of the environment of vehicle 200. The first processing device may then combine the 3D reconstruction with 3D map data or with 3D information calculated based on information from another camera”, paragraph [0166], “In one embodiment, navigational response module 408 may store software executable by processing unit 110 to determine a desired navigational response based on data derived from execution of monocular image analysis module 402 and/or stereo image analysis module 404. Such data may include position and speed information associated with nearby vehicles, pedestrians, and road objects, target position information for vehicle 200, and the like. Additionally, in some embodiments, the navigational response may be based (partially or fully) on map data, a predetermined position of vehicle 200, and/or a relative velocity or a relative acceleration between vehicle 200 and one or more objects detected from execution of monocular image analysis module 402 and/or stereo image analysis module 404. Navigational response module 408 may also determine a desired navigational response based on sensory input (e.g., information from radar) and inputs from other systems of vehicle 200, such as throttling system 220, braking system 230, and steering system 240 of vehicle 200. Based on the desired navigational response, processing unit 110 may transmit electronic signals to throttling system 220, braking system 230, and steering system 240 of vehicle 200 to trigger a desired navigational response by, for example, turning the steering wheel of vehicle 200 to achieve a rotation of a predetermined angle. 
In some embodiments, processing unit 110 may use the output of navigational response module 408 (e.g., the desired navigational response) as an input to execution of velocity and acceleration module 406 for calculating a change in speed of vehicle 200”, paragraph [0171], “FIG. 5B is a flowchart showing an exemplary process 500B for detecting one or more vehicles and/or pedestrians in a set of images, consistent with disclosed embodiments. Processing unit 110 may execute monocular image analysis module 402 to implement process 500B. At step 540, processing unit 110 may determine a set of candidate objects representing possible vehicles and/or pedestrians. For example, processing unit 110 may scan one or more images, compare the images to one or more predetermined patterns, and identify within each image possible locations that may contain objects of interest (e.g., vehicles, pedestrians, or portions thereof). The predetermined patterns may be designed in such a way to achieve a high rate of “false hits” and a low rate of “misses.” For example, processing unit 110 may use a low threshold of similarity to predetermined patterns for identifying candidate objects as possible vehicles or pedestrians. Doing so may allow processing unit 110 to reduce the probability of missing (e.g., not identifying) a candidate object representing a vehicle or pedestrian”, and paragraph [0192], “At step 620, processing unit 110 may execute stereo image analysis module 404 to perform stereo image analysis of the first and second plurality of images to create a 3D map of the road in front of the vehicle and detect features within the images, such as lane markings, vehicles, pedestrians, road signs, highway exit ramps, traffic lights, road hazards, and the like. Stereo image analysis may be performed in a manner similar to the steps described in connection with FIGS. 5A-5D, above. 
For example, processing unit 110 may execute stereo image analysis module 404 to detect candidate objects (e.g., vehicles, pedestrians, road marks, traffic lights, road hazards, etc.) within the first and second plurality of images, filter out a subset of the candidate objects based on various criteria, and perform multi-frame analysis, construct measurements, and determine a confidence level for the remaining candidate objects. In performing the steps above, processing unit 110 may consider information from both the first and second plurality of images, rather than information from one set of images alone. For example, processing unit 110 may analyze the differences in pixel-level data (or other data subsets from among the two streams of captured images) for a candidate object appearing in both the first and second plurality of images. As another example, processing unit 110 may estimate a position and/or velocity of a candidate object (e.g., relative to vehicle 200) by observing that the object appears in one of the plurality of images but not the other or relative to other differences that may exist relative to objects appearing in the two image streams. For example, position, velocity, and/or acceleration relative to vehicle 200 may be determined based on trajectories, positions, movement characteristics, etc. of features associated with an object appearing in one or both of the image streams.”). 
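For illustration only, the claim 3 decision step at issue here — generating a third map from additional sensor data, comparing its elements against the first map, and reverting to the first map for an additional navigation function when every difference is below the threshold — can be sketched as follows. All identifiers and the position representation are hypothetical; SHALEV-SHWARTZ is cited for map generation and navigational response, not for this code:

```python
# Hypothetical sketch of the claim 3 fallback decision: once a later
# sensing pass (the third map) agrees with the first map, the first map
# is used again for the additional navigation function.

def within_threshold(first_map, third_map, threshold):
    """True if every corresponding element of the third map differs from
    the first map by less than the threshold."""
    for element_id, (x1, y1) in first_map.items():
        x3, y3 = third_map.get(element_id, (x1, y1))
        if ((x3 - x1) ** 2 + (y3 - y1) ** 2) ** 0.5 >= threshold:
            return False
    return True

def map_for_additional_function(first_map, second_map, third_map, threshold):
    """Fall back to the first map when fresh sensor data agrees with it;
    otherwise continue using the second map."""
    if within_threshold(first_map, third_map, threshold):
        return first_map
    return second_map

first = {"lane_1": (0.0, 0.0)}
second = {"lane_1": (1.5, 0.0)}
third = {"lane_1": (0.2, 0.0)}  # later sensing pass: only 0.2 m off
chosen = map_for_additional_function(first, second, third, threshold=1.0)
```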
Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to combine the invention of Ganjineh with the teachings of Wheeler and SHALEV-SHWARTZ such that the map system of Ganjineh is further configured to determine at least one difference between the first one or more elements shown in the first map and the second one or more elements shown in the second map is greater than a threshold difference based on comparing the first one or more elements shown in the first map with the second one or more elements shown in the second map, determine to use the second map for at least one navigation function based on determining the at least one difference is greater than the threshold difference, perform the at least one navigation function using the second map, wherein the at least one navigation function controls a movement of a vehicle, and wherein, to perform the at least one navigation function, the at least one processor is configured to: cause the vehicle to perform the at least one navigation function using the second map, as taught by Wheeler (See paragraphs [0037], [0040], [0043], [0044], [0053], [0122].), and to generate a third map of the environment based on additional sensor data from the one or more sensors, compare third one or more elements of the first map and fourth one or more elements of the third map, each respective element of the third one or more elements corresponding to a respective element of the fourth one or more elements, determine at least one difference between the third one or more elements of the first map and the fourth one or more elements of the third map is less than the threshold difference, and determine to use the first map for at least one additional navigation function based on the at least one difference between the third one or more elements and the fourth one or more elements being less than the threshold difference, as taught by SHALEV-SHWARTZ (See paragraphs [0154], [0166], [0171], 
[0192].), with a reasonable expectation of success. The motivation for doing so would be increased navigation safety by providing low latency, high accuracy, data freshness, and store efficiency, as taught by Wheeler (See paragraph [0029].). The motivation for doing so would be increased safety assurance and navigation accuracy, as taught by SHALEV-SHWARTZ (See paragraph [0003], [0004].). With respect to claim 18, please see the rejection above with respect to claim 3, which is commensurate in scope to claim 18, with claim 3 being drawn to a mapping system and claim 18 being drawn to a corresponding method. Regarding Claim 32, Ganjineh and Wheeler teach The non-transitory computer-readable medium of claim 31, as set forth in the obviousness rejection above. Ganjineh and Wheeler do not explicitly disclose, however, SHALEV-SHWARTZ, in the same field of endeavor, teaches wherein the instructions, when executed by the at least one processor, cause the at least one processor to: generate a third map of the environment based on additional sensor data from the one or more sensors; compare third one or more elements of the first map and fourth one or more elements of the third map, each respective element of the third one or more elements corresponding to a respective element of the fourth one or more elements; determine at least one difference between the third one or more elements of the first map and the fourth one or more elements of the third map is less than the threshold difference; and determine to use the first map for at least one additional navigation function based on the at least one difference between the third one or more elements and the fourth one or more elements being less than the threshold difference (See at least paragraph [0154], “In a three camera system, a first processing device may receive images from both the main camera and the narrow field of view camera, and perform vision processing of the narrow FOV camera to, for example, detect other 
vehicles, pedestrians, lane marks, traffic signs, traffic lights, and other road objects. Further, the first processing device may calculate a disparity of pixels between the images from the main camera and the narrow camera and create a 3D reconstruction of the environment of vehicle 200. The first processing device may then combine the 3D reconstruction with 3D map data or with 3D information calculated based on information from another camera”, paragraph [0166], “In one embodiment, navigational response module 408 may store software executable by processing unit 110 to determine a desired navigational response based on data derived from execution of monocular image analysis module 402 and/or stereo image analysis module 404. Such data may include position and speed information associated with nearby vehicles, pedestrians, and road objects, target position information for vehicle 200, and the like. Additionally, in some embodiments, the navigational response may be based (partially or fully) on map data, a predetermined position of vehicle 200, and/or a relative velocity or a relative acceleration between vehicle 200 and one or more objects detected from execution of monocular image analysis module 402 and/or stereo image analysis module 404. Navigational response module 408 may also determine a desired navigational response based on sensory input (e.g., information from radar) and inputs from other systems of vehicle 200, such as throttling system 220, braking system 230, and steering system 240 of vehicle 200. Based on the desired navigational response, processing unit 110 may transmit electronic signals to throttling system 220, braking system 230, and steering system 240 of vehicle 200 to trigger a desired navigational response by, for example, turning the steering wheel of vehicle 200 to achieve a rotation of a predetermined angle. 
In some embodiments, processing unit 110 may use the output of navigational response module 408 (e.g., the desired navigational response) as an input to execution of velocity and acceleration module 406 for calculating a change in speed of vehicle 200”, paragraph [0171], “FIG. 5B is a flowchart showing an exemplary process 500B for detecting one or more vehicles and/or pedestrians in a set of images, consistent with disclosed embodiments. Processing unit 110 may execute monocular image analysis module 402 to implement process 500B. At step 540, processing unit 110 may determine a set of candidate objects representing possible vehicles and/or pedestrians. For example, processing unit 110 may scan one or more images, compare the images to one or more predetermined patterns, and identify within each image possible locations that may contain objects of interest (e.g., vehicles, pedestrians, or portions thereof). The predetermined patterns may be designed in such a way to achieve a high rate of “false hits” and a low rate of “misses.” For example, processing unit 110 may use a low threshold of similarity to predetermined patterns for identifying candidate objects as possible vehicles or pedestrians. Doing so may allow processing unit 110 to reduce the probability of missing (e.g., not identifying) a candidate object representing a vehicle or pedestrian”, and paragraph [0192], “At step 620, processing unit 110 may execute stereo image analysis module 404 to perform stereo image analysis of the first and second plurality of images to create a 3D map of the road in front of the vehicle and detect features within the images, such as lane markings, vehicles, pedestrians, road signs, highway exit ramps, traffic lights, road hazards, and the like. Stereo image analysis may be performed in a manner similar to the steps described in connection with FIGS. 5A-5D, above. 
For example, processing unit 110 may execute stereo image analysis module 404 to detect candidate objects (e.g., vehicles, pedestrians, road marks, traffic lights, road hazards, etc.) within the first and second plurality of images, filter out a subset of the candidate objects based on various criteria, and perform multi-frame analysis, construct measurements, and determine a confidence level for the remaining candidate objects. In performing the steps above, processing unit 110 may consider information from both the first and second plurality of images, rather than information from one set of images alone. For example, processing unit 110 may analyze the differences in pixel-level data (or other data subsets from among the two streams of captured images) for a candidate object appearing in both the first and second plurality of images. As another example, processing unit 110 may estimate a position and/or velocity of a candidate object (e.g., relative to vehicle 200) by observing that the object appears in one of the plurality of images but not the other or relative to other differences that may exist relative to objects appearing in the two image streams. For example, position, velocity, and/or acceleration relative to vehicle 200 may be determined based on trajectories, positions, movement characteristics, etc. of features associated with an object appearing in one or both of the image streams.”). 
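The stereo analysis quoted above turns pixel disparity between the two camera streams into depth via triangulation. As a minimal illustrative sketch of that relationship (the focal length and camera baseline below are hypothetical values, not parameters from the application or the cited references):

```python
import numpy as np

# Hypothetical camera parameters (illustrative only, not from the application)
FOCAL_PX = 700.0    # focal length in pixels
BASELINE_M = 0.5    # distance between the two cameras, in meters

def disparity_to_depth(disparity_px: np.ndarray) -> np.ndarray:
    """Convert per-pixel disparity to depth using Z = f * B / d.

    Pixels with zero or negative disparity carry no depth information
    and are marked invalid (infinity).
    """
    depth = np.full_like(disparity_px, np.inf, dtype=float)
    valid = disparity_px > 0
    depth[valid] = FOCAL_PX * BASELINE_M / disparity_px[valid]
    return depth

# Larger disparity means a closer object; zero disparity stays invalid.
disparities = np.array([0.0, 3.5, 7.0, 35.0])
print(disparity_to_depth(disparities))
```

A production stereo pipeline (e.g., block matching followed by 3D reprojection) adds rectification and subpixel refinement, but the depth-from-disparity step reduces to this formula.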
Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date to combine the invention of Ganjineh with the teachings of Wheeler and SHALEV-SHWARTZ such that the map system of Ganjineh is further configured to determine at least one difference between the first one or more elements shown in the first map and the second one or more elements shown in the second map is greater than a threshold difference based on comparing the first one or more elements shown in the first map with the second one or more elements shown in the second map, determine to use the second map for at least one navigation function based on determining the at least one difference is greater than the threshold difference, perform the at least one navigation function using the second map, wherein the at least one navigation function controls a movement of a vehicle, and wherein, to perform the at least one navigation function, the at least one processor is configured to: cause the vehicle to perform the at least one navigation function using the second map, as taught by Wheeler (See paragraphs [0037], [0040], [0043], [0044], [0053], [0122].), and to generate a third map of the environment based on additional sensor data from the one or more sensors, compare third one or more elements of the first map and fourth one or more elements of the third map, each respective element of the third one or more elements corresponding to a respective element of the fourth one or more elements, determine at least one difference between the third one or more elements of the first map and the fourth one or more elements of the third map is less than the threshold difference, and determine to use the first map for at least one additional navigation function based on the at least one difference between the third one or more elements and the fourth one or more elements being less than the threshold difference, as taught by SHALEV-SHWARTZ (See paragraphs [0154], [0166], [0171], 
[0192].), with a reasonable expectation of success. The motivation for doing so would be increased navigation safety by providing low latency, high accuracy, data freshness, and storage efficiency, as taught by Wheeler (See paragraph [0029].). The motivation for doing so would be increased safety assurance and navigation accuracy, as taught by SHALEV-SHWARTZ (See paragraphs [0003], [0004].).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JEWEL ASHLEY KUNTZ, whose telephone number is (571) 270-5542. The examiner can normally be reached M-F, 8:30am-5:30pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Anne Antonucci, can be reached at (313) 446-6519. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JEWEL A KUNTZ/
Examiner, Art Unit 3666

/ANNE MARIE ANTONUCCI/
Supervisory Patent Examiner, Art Unit 3666
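The claim logic the examiner maps across Wheeler and SHALEV-SHWARTZ is a threshold-gated map selection: use the sensor-derived second map when its elements diverge from the stored first map beyond a threshold, and revert to the stored map when a later sensor-derived map agrees within the threshold. A minimal sketch of that decision logic follows; the element representation (2D points) and the distance metric are illustrative assumptions, since the claims as quoted do not specify either:

```python
from dataclasses import dataclass

THRESHOLD = 1.0  # illustrative threshold difference (units arbitrary)

@dataclass
class MapElement:
    """A map element (e.g., lane mark or sign) reduced to a 2D position."""
    x: float
    y: float

def max_element_difference(a: list[MapElement], b: list[MapElement]) -> float:
    """Compare corresponding elements of two maps pairwise and return
    the largest positional difference (L1 distance, an assumed metric)."""
    return max(abs(ea.x - eb.x) + abs(ea.y - eb.y) for ea, eb in zip(a, b))

def select_map(stored: list[MapElement], sensed: list[MapElement]) -> str:
    """Use the sensor-derived map for navigation when it diverges from
    the stored map beyond THRESHOLD; otherwise keep the stored map."""
    diff = max_element_difference(stored, sensed)
    return "second (sensor) map" if diff > THRESHOLD else "first (stored) map"

stored = [MapElement(0.0, 0.0), MapElement(10.0, 0.0)]
sensed = [MapElement(0.1, 0.0), MapElement(12.5, 0.0)]  # one element shifted 2.5
print(select_map(stored, sensed))  # difference 2.5 > 1.0, so the sensor map is used
```

The "third map" limitation taught by SHALEV-SHWARTZ is the same comparison run on fresh sensor data: when `max_element_difference` falls back below `THRESHOLD`, `select_map` returns to the first (stored) map.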

Prosecution Timeline

May 02, 2023
Application Filed
Mar 22, 2025
Non-Final Rejection — §103
Jun 06, 2025
Interview Requested
Jun 16, 2025
Applicant Interview (Telephonic)
Jun 16, 2025
Examiner Interview Summary
Jun 26, 2025
Response Filed
Oct 21, 2025
Final Rejection — §103
Dec 17, 2025
Response after Non-Final Action
Jan 07, 2026
Applicant Interview (Telephonic)
Jan 07, 2026
Examiner Interview Summary
Jan 20, 2026
Request for Continued Examination
Feb 26, 2026
Response after Non-Final Action
Mar 04, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12578195
INFORMATION PROCESSING SYSTEM AND INFORMATION PROCESSING METHOD
2y 5m to grant Granted Mar 17, 2026
Patent 12565204
VEHICLE CONTROL DEVICE, VEHICLE CONTROL METHOD, AND STORAGE MEDIUM
2y 5m to grant Granted Mar 03, 2026
Patent 12542012
TEST SYSTEM, CONTROL DEVICE, TEST METHOD, AND TEST SYSTEM PROGRAM
2y 5m to grant Granted Feb 03, 2026
Patent 12523490
Systems and Methods for Vehicle Navigation
2y 5m to grant Granted Jan 13, 2026
Patent 12518631
Vehicle Scheduling Method, Electronic Equipment and Storage Medium
2y 5m to grant Granted Jan 06, 2026
Based on this examiner's 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
72%
Grant Probability
80%
With Interview (+7.9%)
3y 0m
Median Time to Grant
High
PTA Risk
Based on 68 resolved cases by this examiner. Grant probability derived from career allow rate.
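The projection figures above appear to combine the career allow rate (49 granted of 68 resolved) with the +7.9% interview lift. A back-of-envelope check, assuming the tool simply adds the lift to the base rate (an assumption about its method, not a documented formula), reproduces the displayed 72% and 80%:

```python
# Reproduce the headline projection figures from the examiner's career data.
granted, resolved = 49, 68
base_rate = granted / resolved        # career allow rate: ~72%
interview_lift = 0.079                # +7.9% lift among interviewed cases
with_interview = base_rate + interview_lift

print(f"base: {base_rate:.0%}, with interview: {with_interview:.0%}")
# base: 72%, with interview: 80%
```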
