Prosecution Insights
Last updated: April 19, 2026
Application No. 18/219,605

SYSTEMS AND METHODS FOR AUTONOMOUS DRIVING USING TRACKING TAGS

Final Rejection (§103, §112)
Filed: Jul 07, 2023
Examiner: IVEY, DANA DESHAWN
Art Unit: 3662
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Torc Robotics, Inc.
OA Round: 2 (Final)
Grant Probability: 90% (Favorable)
OA Rounds: 3-4
To Grant: 2y 2m
With Interview: 97%

Examiner Intelligence

Career Allow Rate: 90% (683 granted / 762 resolved; +37.6% vs TC avg; grants above average)
Interview Lift: +7.3% (moderate lift; resolved cases with vs. without interview)
Avg Prosecution: 2y 2m (fast prosecutor); 44 currently pending
Total Applications: 806 (career history, across all art units)
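The headline allow rate follows directly from the career counts shown above. A quick sanity check (Python; this assumes the "+37.6% vs TC avg" delta is a simple percentage-point difference, which is an interpretation, not something the dashboard states):

```python
# Sanity-check of the dashboard's examiner statistics.
granted, resolved = 683, 762            # career counts from the dashboard

allow_rate = granted / resolved * 100   # career allowance rate
tc_delta = 37.6                         # "+37.6% vs TC avg" as shown
implied_tc_avg = allow_rate - tc_delta  # assumed percentage-point reading

print(f"allow rate:     {allow_rate:.1f}%")      # 89.6%, displayed as 90%
print(f"implied TC avg: {implied_tc_avg:.1f}%")  # ~52%
```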

Statute-Specific Performance

§101: 2.3% (-37.7% vs TC avg)
§103: 27.9% (-12.1% vs TC avg)
§102: 42.1% (+2.1% vs TC avg)
§112: 21.9% (-18.1% vs TC avg)
Tech Center average shown as estimate • Based on career data from 762 resolved cases

Office Action

Grounds: §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This final action is in response to Applicant’s filing dated July 2, 2025. Claims 1-20 are currently pending and have been considered, as provided in more detail below. Claims 1, 7-8 and 10-20 have been amended.

*Examiner Note: Claim language is bolded. Cited References and Applicant’s arguments are italicized. Examiner interpretations are preceded with an asterisk (*).

Response to Arguments

Applicant's arguments filed 7/2/25 have been fully considered, but they are not persuasive and are moot because the arguments are directed to the previously presented claims before the amendment. Applicant’s amendment of the independent claims to include new matter has necessitated new grounds of rejection.

Regarding Applicant’s position on page 10 of the remarks that the cited references Yoon, Bogatine, Xu, or McCarthy do not “describe or suggest at least "detect a unique numerical identification of a tracking tag embedded underneath the surface of the road based on data collected from the sensor during the monitoring of the surface, the unique numerical identification being specific to the tracking tag and to a designated portion of the road in which the tracking tag is located," "query a database using the unique numerical identification of the tracking tag, the database comprising navigational information mapped to different unique numerical identifications of tracking tags," "detect, via the sensor, one or more changes to one or more subsequent tracking tags on the road while the autonomous vehicle is operating according to the navigational action, the one or more subsequent tracking tags being specific to different designated portions of the road," and "cause the database to be updated by mapping data associated with the detected one or more changes to the unique numerical identifications
corresponding to the subsequent tracking tags," as recited in amended independent claim 1”, the Examiner respectfully does not agree because: The base reference of Yoon is being relied upon to show a unique numerical identification (see at least para. [0014] of Yoon which discloses “many unique addresses can be incorporated into the RFID tag 106”, *Examiner interprets the unique addresses to be unique numerical identification); being specific to the tracking tag and to a designated portion of the road in which the tracking tag is located (see at least the abstract of Yoon which discloses “data obtained from at least one of the RFID tags to determine vehicle location along the transportation surface”, *Examiner interprets the portion of the transportation surface at which the vehicle is located to be a designated portion of the road and see at least para. [0013] of Yoon which discloses “a vehicle 206 as it moves down the roadway, the response 209 from the RFID tag 106 can be used by the vehicle (e.g., an autonomous vehicle) to know if it is correctly positioned in the lane, but the response 209 can also be used to identify which lane the vehicle 206 is in. Due to the RF frequency used in RFID systems, road hazards such as water or snow on the roadway do not interfere with the operation, allowing autonomous vehicles to navigate roads even when the optical cameras cannot find the roadway. The GatorEye system can be implemented for use on highways, as well as city roadways, local driveways, and in parking lots to assist with smart parking (informing parking lot occupancy, vacancy, etc.)”); detection of abnormalities in the tracking tags on the road which can be compared to one or more changes to one or more subsequent tracking tags on the road (see at least para. [0027] of Yoon which discloses “It can be relatively easy to detect abnormalities without causing unsafe operation of the vehicle 206 and replacement of the device would be simple. 
Detection of the magnetic paint would similarly be easy”, *Examiner interprets this as an example of how potential changes in the tracking tags located near magnetic paint can be detected), one or more changes to one or more subsequent tracking tags on the road while the autonomous vehicle is operating according to the navigational action (see at least para. [0011] of Yoon which teaches “the disclosed system … can take the lead role in vehicle navigation and complement the detection and sensing roles of other existing systems”, *Examiner interprets the fact that the system takes a role in navigation to mean the system will determine a navigational action and see at least para. [0028] which discloses “autonomous vehicle operation, allowing the alternative systems to act as supporting roles for detecting extraneous road hazards”, *Examiner interprets vehicle operation to be navigational action), the one or more subsequent tracking tags being specific to different designated portions of the road (see at least para. [0027] of Yoon which discloses “units 100 can be closely spaced along both sides of the lane”, *Examiner interprets the positions at which units 100 are spaced along the lane to be equivalent to tags being placed at different designated portions of the road). Bogatine is relied upon to show a tracking tag (Fig. 2, 21 of Bogatine and see para. [0037] of Bogatine which describes “RFID tags 21 has been installed”) embedded underneath the surface of the road (see at least para. [0030] of Bogatine which discloses “RFID tags are located at some depth below the road surface”, *Examiner interprets a location at a depth below the road surface to be embedded underneath the surface of the road). Finally, Xu is evidence of the database comprising navigational information mapped (see at least para.
[0023] of Xu which discloses “The map matching process associates the vehicle's GPS position (latitude, longitude, altitude), and optionally the heading and speed information, to a road segment” and see at least para. [0024] of Xu which discloses “Each RFID tag is associated with certain information”, *Examiner interprets this associated feature as evidence that the information is mapped to unique identifications that may be numerical since they reference latitude and longitude) to different unique numerical identifications of tracking tags (see at least para. [0006] of Xu which discloses “querying a database using the identifier of the at least one RFID tag to collect information about a location of the at least one RFID tag; determining a bias of a vehicle relative to the location of the scanned at least one RFID tag; and calculating a location of the vehicle based on the bias and the location of the at least one RFID tag”, *Examiner interprets information about a location of the RFID tag to be navigational information which will correspond to different latitude, longitude and height numbers, as discussed above); and cause the database to be updated (see at least para. [0004] of Xu which discloses “navigational aids can help an autonomous vehicle identify its location in line of sight situations which allows its position to be updated on a map using map matching techniques”) by mapping data (see at least para. [0023] of Xu which discloses “The map matching process associates the vehicle's GPS position (latitude, longitude, altitude), and optionally the heading and speed information, to a road segment” and see at least para. [0024] of Xu which discloses “Each RFID tag is associated with certain information”, *Examiner interprets this associated feature as evidence that the information is mapped to unique identifications that may be numerical in the fact that they reference latitude and longitude). 
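The disputed claim elements describe a detect-query-act-update loop. As a purely illustrative sketch of how the Examiner maps that loop onto the references (every identifier, value, and function name below is invented for illustration and is not drawn from Yoon, Bogatine, Xu, or the application):

```python
# Hypothetical illustration of the amended claim-1 loop; all names and data
# here are invented for the sketch, not taken from the cited references.

nav_db = {
    # unique numerical identification -> navigational information
    1001: {"lane": 2, "segment": "mile 42.0"},
    1002: {"lane": 2, "segment": "mile 42.1"},
}

def detect_tag(sensor_reading):
    """Detect a tag's unique numerical identification from sensor data."""
    return sensor_reading["tag_id"]

def navigational_action(tag_id):
    """Query the database and determine an action from the mapped info."""
    info = nav_db[tag_id]
    return {"hold_lane": info["lane"]}

def record_change(tag_id, change):
    """Map an observed change back to the subsequent tag's identification
    (the update step the parties dispute)."""
    nav_db[tag_id]["observed_change"] = change

action = navigational_action(detect_tag({"tag_id": 1001}))
record_change(1002, "degraded response")  # subsequent tag along the route
```

The sketch only makes the claim structure concrete; whether each step is actually taught or suggested by the combination is the legal question the rejection addresses.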
Nevertheless, it should be noted that Applicant’s additional amendments have necessitated further new grounds of rejection. While the new grounds of rejection may rely on some of the previous references applied in the prior rejection of record, a new additional reference has been added to the combination and introduced for Applicant’s consideration given the amended independent claims, as discussed in detail below.

Response to Amendment

Regarding the rejections under 35 USC §103, the amendments made to the claims have necessitated new grounds of rejection, as outlined below.

Claim Objections

Claim 12 is objected to because of the following informalities: The phrase “by the more one processors” appears to contain a typographical error. Perhaps Applicant intended to use the phrase – by the one or more processors – instead. Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 1-20 are rejected under 35 U.S.C. 112(a) or 35 U.S.C.
112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claims contain subject matter (detect, via the sensor, one or more changes to one or more subsequent tracking tags; detecting, by the more one processors via the sensor, one or more changes to one or more subsequent tracking tags; and detect, via the sensor, one or more changes to one or more subsequent markings) which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention. The newly added limitation set forth in claims 1, 12 and 17, detecting, by the more one processors via the sensor, one or more changes to one or more subsequent tracking tags, is new matter because the specification contains no discussion or support for detecting changes to one or more subsequent tracking tags.

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
In claims 1, 12 and 17, as amended, the recitation of “detect, via the sensor, one or more changes to one or more subsequent tracking tags”; “detecting, by the more one processors via the sensor, one or more changes to one or more subsequent tracking tags”; and “detect, via the sensor, one or more changes to one or more subsequent markings on the road while the autonomous vehicle is operating according to the navigational action, the one or more subsequent markings being specific to different designated portions of the road”, respectively, is confusing, vague and unclear because Applicant’s specification does not explain how detection of changes to tracking tags or markings will occur. Applicant fails to identify how the detection happens and how it is measured by the sensor. This recitation is indefinite.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-2, 6-16 are rejected under 35 U.S.C. 103 as being unpatentable over Yoon et al. (US 2018/0165526 A1) in view of Bogatine (US 2012/0098657 A1) and further in view of Xu et al. (US 2017/0074964 A1) and further in view of Fraser (CA2845230A1).

Regarding amended claim 1, Yoon discloses An autonomous vehicle (Fig. 3, 206 and see at least para. [0013] of Yoon which describes “a vehicle 206 as it moves down the roadway … the vehicle (e.g., an autonomous vehicle”), comprising: a sensor (Fig. 3, 315 and see at least para. [0002] of Yoon which discloses “autonomous vehicles use a combination of sensors such as optical and infrared cameras, LIDAR, and ultrasonic sensors“ and see at least para. [0015] which discloses “vehicles 206 equipped with a RFID reader”, *Examiner interprets the RFID reader to be the claimed sensor since para. [0040] of Applicant’s specification describes a sensor (e.g., a non-visible camera, a receiver, a radio frequency identification (RFID) reader) of the perception module 202 monitoring the surface. Additionally, Applicant’s specification describes how the vehicle 200 includes the perception module 202 which includes the sensor (RFID reader) – see para. [0023] of Applicant’s specification); and one or more processors (Fig. 3, 303 and see at least para.
[0016] of Yoon which discloses “a processing system 300 included in a vehicle 206 … The processing system 300 includes at least one processor circuit, for example, having a processor 303”, *Examiner interprets processing system 300 to include processors) configured to: monitor, using the sensor, a surface of a road while the autonomous vehicle is driving on the road (see at least para. [0011] of Yoon which discloses “Sensor based methods used by autonomous vehicles … while driving“ and see at least para. [0026] which discloses “readers and/or sensors can monitor a magnetic paint that can be applied when painting lines on roads”, *Examiner interprets this as the surface of a road is monitored using a sensor); detect a unique numerical identification (see at least para. [0014] of Yoon which discloses “many unique addresses can be incorporated into the RFID tag 106”, *Examiner interprets the unique addresses to be unique numerical identification and see at least para. [0018] of Yoon which discloses “the RFID tags 106 can be active tags that can respond with sensor information in addition to identifying information. The RFID reader 315 receives the data from the GatorEye unit 100 and provides it to the processing circuitry for evaluation” and see at least para. [0030] which discloses “ratios, concentrations, amounts, and other numerical data may be expressed herein in a range format”, *Examiner interprets this numerical data as a unique numeral identification) of a tracking tag (Fig. 1, 106 and see at least para. [0013] of Yoon which describes “the RFID tag 106 can be used by the vehicle (e.g., an autonomous vehicle) to know if it is correctly positioned in the lane”) on the surface of the road (see at least para. 
[0018] of Yoon which describes “the RIFD tags 106 in the GatorEye units 100 located along the edges of the lanes”, *Examiner interprets tracking tag 106 to be on the road) based on data collected from the sensor during the monitoring of the surface (see at least para. [0015] of Yoon which describes “data that was collected” and see at least para. [0018] which discloses “The RFID reader 315 receives the data from the GatorEye unit 100 and provides it to the processing circuitry for evaluation”), the unique numerical identification (see at least para. [0014] of Yoon which discloses “many unique addresses can be incorporated into the RFID tag 106”, *Examiner interprets the unique addresses to be unique numerical identification) being specific to the tracking tag and to a designated portion of the road in which the tracking tag is located (see at least the abstract of Yoon which discloses “data obtained from at least one of the RFID tags to determine vehicle location along the transportation surface”, *Examiner interprets the portion of the transportation surface at which the vehicle is located to be a designated portion of the road and see at least para. [0013] of Yoon which discloses “a vehicle 206 as it moves down the roadway, the response 209 from the RFID tag 106 can be used by the vehicle (e.g., an autonomous vehicle) to know if it is correctly positioned in the lane, but the response 209 can also be used to identify which lane the vehicle 206 is in. Due to the RF frequency used in RFID systems, road hazards such as water or snow on the roadway do not interfere with the operation, allowing autonomous vehicles to navigate roads even when the optical cameras cannot find the roadway. The GatorEye system can be implemented for use on highways, as well as city roadways, local driveways, and in parking lots to assist with smart parking (informing parking lot occupancy, vacancy, etc.)”); a database (Fig. 3, 324 and see at least para.
[0017] of Yoon which discloses “Stored in the memory 306 are both data and several components that are executable by the processor 303. In particular, stored in the memory 306 and executable by the processor 303 are various application modules or programs such as, e.g., a GatorEye module, application, or program 321 for acquisition and evaluation of information obtained from the GatorEye unit 100 via the RFID reader 315. Also stored in the memory 306 may be a data store 324 and other data”, *Examiner interprets data store 324 to be a database) identify first navigational information (see at least para. [0009] which discloses “The autonomous vehicles can use these tags to safely navigate roads”, *Examiner interprets use of the tags to identify navigational information to navigate the roads. And see at least para. [0014] which discloses “with the very precise positioning system, an additional measure of redundancy/security can be incorporated for surrounding vehicles during operations such as lane changing or approaching traffic stops”, *Examiner interprets the positioning system will identify first navigational information such as lane changing (see para. [0006] of Applicant’s own specification which describes the navigational information may indicate a velocity to move at, a direction to go, a lane to be in, or another type of navigational command for the autonomous vehicle to perform) from the database (Fig. 3, 324 and see at least para. [0017] of Yoon which discloses “Stored in the memory 306 are both data and several components that are executable by the processor 303. In particular, stored in the memory 306 and executable by the processor 303 are various application modules or programs such as, e.g., a GatorEye module, application, or program 321 for acquisition and evaluation of information obtained from the GatorEye unit 100 via the RFID reader 315. 
Also stored in the memory 306 may be a data store 324 and other data”, *Examiner interprets data store 324 to be a database) that corresponds to the detected unique numerical identification (see at least para. [0014] of Yoon which discloses “many unique addresses can be incorporated into the RFID tag 106”, *Examiner interprets the unique addresses to be unique numerical identification) of the tracking tag (Fig. 1, 106 and see at least para. [0013] which describes “the RFID tag 106 can be used by the vehicle (e.g., an autonomous vehicle) to know if it is correctly positioned in the lane”) on the surface of the road (see at least para. [0018] of Yoon which describes “the RIFD tags 106 in the GatorEye units 100 located along the edges of the lanes”, *Examiner interprets tracking tag 106 to be on the road); determine a navigational action (see at least para. [0011] of Yoon which discloses “the disclosed system … can take the lead role in vehicle navigation and complement the detection and sensing roles of other existing systems”, *Examiner interprets the fact that the system takes a role in navigation to mean the system will determine a navigational action and see at least para. [0028] which discloses “autonomous vehicle operation, allowing the alternative systems to act as supporting roles for detecting extraneous road hazards”, *Examiner interprets vehicle operation to be navigational action) based on the first navigational information (see at least para. [0014] which discloses “operations such as lane changing or approaching traffic stops”, *Examiner interprets lane changing to be a navigational information on which the navigational action occurs); operate the autonomous vehicle according to the navigational action (see at least para. [0009] of Yoon which discloses “The autonomous vehicles can use these tags to safely navigate roads, even if the lane lines are covered in rain or snow. 
Additionally, vehicle-to-vehicle communications could be used to add an additional safety measure when performing maneuvers such as changing lanes or slowing at a stop light”, *Examiner interprets this as operating the vehicle based on navigation of roads since lane changes will still occur with caution because of road conditions); detect (see at least para. [0027] of Yoon which discloses “It can be relatively easy to detect abnormalities without causing unsafe operation of the vehicle 206 and replacement of the device would be simple. Detection of the magnetic paint would similarly be easy”, *Examiner interprets this as an example of how potential changes in the tracking tags located near magnetic paint can be detected), via the sensor (see at least para. [0027] of Yoon which discloses “Since GatorEye units 100 can be closely spaced along both sides of the lane, the vehicle 206 can use two units 100 to simultaneously determine its lane position”), one or more changes to one or more subsequent tracking tags on the road while the autonomous vehicle is operating according to the navigational action (as discussed above, see at least para. [0011] of Yoon which teaches “the disclosed system … can take the lead role in vehicle navigation and complement the detection and sensing roles of other existing systems”, *Examiner interprets the fact that the system takes a role in navigation to mean the system will determine a navigational action and see at least para. [0028] which discloses “autonomous vehicle operation, allowing the alternative systems to act as supporting roles for detecting extraneous road hazards”, *Examiner interprets vehicle operation to be navigational action), the one or more subsequent tracking tags being specific to different designated portions of the road (see at least para.
[0027] of Yoon which discloses “units 100 can be closely spaced along both sides of the lane”, *Examiner interprets the positions at which units 100 are spaced along the lane to be equivalent to tags being placed at different designated portions of the road). Yoon et al. does disclose a tracking tag (Fig. 1, 106 and see at least para. [0013] of Yoon which describes “the RFID tag 106 can be used by the vehicle (e.g., an autonomous vehicle) to know if it is correctly positioned in the lane”) along the surface of the road (see at least para. [0018] which discloses “the RIFD tags 106 in the GatorEye units 100 located along the edges of the lanes”, *Examiner interprets tags located along the lanes to mean the tags 106 are embedded along the surface of the road). Yoon et al. may not explicitly disclose the tracking tag is embedded underneath the surface of the road. However, in the same field of endeavor, Bogatine discloses a tracking tag (Fig. 2, 21 of Bogatine and see para. [0037] of Bogatine which describes “RFID tags 21 has been installed”) embedded underneath the surface of the road (see at least para. [0030] of Bogatine which discloses “RFID tags are located at some depth below the road surface”, *Examiner interprets a location at a depth below the road surface to be embedded underneath the surface of the road). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the tracking tags of Yoon et al. to be located underneath the surface of the road as taught in Bogatine with a reasonable expectation of success in order to protect the RFID tags from possible mechanical damages on a road surface. See para. [0030] of Bogatine for motivation. Further regarding claim 1, Yoon, as modified by Bogatine, does disclose a query in the form of an interrogation of the RFID tag (see at least para. 
[0018] of Yoon which discloses “The RFID reader 315 can interrogate the RIFD tags 106 in the GatorEye units 100 located along the edges of the lanes”, *Examiner interprets the interrogation to be similar to a query). Yoon et al., as modified by Bogatine, may not explicitly disclose the one or more processors (Fig. 3, 303 of Yoon and see at least para. [0016] of Yoon which discloses “a processing system 300 included in a vehicle 206 … The processing system 300 includes at least one processor circuit, for example, having a processor 303”, *Examiner interprets processing system 300 to include processors) are configured to: query a database using the unique numerical identification of the tracking tag, the database comprising navigational information mapped to different unique numerical identifications of tracking tags; identify first navigational information from the database that corresponds to the detected unique numerical identification of the tracking tag embedded underneath the surface of the road; determine a navigational action based on the first navigational information; operate the autonomous vehicle according to the navigational action and cause the database to be updated by mapping data. However, in the same field of endeavor, Xu et al. disclose one or more processors (Fig. 5, 502 of Xu and see at least para. [0037] of Xu which describes “The processor 502”) configured to: query a database (see at least para. [0007] of Xu which discloses “the at least one processor, cause the apparatus at least to perform: … query a database”) using the unique numerical identification (see at least para. [0024] of Xu which discloses “Each RFID tag is associated with certain information. The information is notated as {ID, X, Y, Z} in FIG. 2A, where ID is a unique identifier of a specific street light, X is the latitude of the street light. Y is the longitude of the street light, and Z is the height street light.
It is noted that the latitude, longitude, and height could alternatively correspond to the latitude, longitude, and height of the RFID tag itself. When the RFID tag of a certain street light is scanned, the unique identifier, ID, is used to access a database”, *Examiner interprets the unique identifier and the latitude, longitude and height to be unique numerical identification/data since they can all be expressed as decimal numbers or coordinates, i.e., numerical identification) of the tracking tag (see at least para. [0006] of Xu which discloses “querying a database using the identifier of the at least one RFID tag to collect information about a location of the at least one RFID tag”, *Examiner interprets this to be the query of the database using the numerical identification of the tracking tag), the database comprising navigational information mapped (see at least para. [0023] of Xu which discloses “The map matching process associates the vehicle's GPS position (latitude, longitude, altitude), and optionally the heading and speed information, to a road segment” and see at least para. [0024] of Xu which discloses “Each RFID tag is associated with certain information”, *Examiner interprets this associated feature as evidence that the information is mapped to unique identifications that may be numerical since they reference latitude and longitude) to different unique numerical identifications of tracking tags (see at least para. 
[0006] of Xu which discloses “querying a database using the identifier of the at least one RFID tag to collect information about a location of the at least one RFID tag; determining a bias of a vehicle relative to the location of the scanned at least one RFID tag; and calculating a location of the vehicle based on the bias and the location of the at least one RFID tag”, *Examiner interprets information about a location of the RFID tag to be navigational information which will correspond to different latitude, longitude and height numbers, as discussed above); identify first navigational information (see at least para. [0029] of Xu which discloses “At block 404, a database is queried using the identifier of the scanned at least one RFID tag to collect information about a location of the at least one RFID tag. According to a certain embodiment of the invention, the information may include a latitude of the RFID tag, a longitude of the RFID tag and/or a height of the RFID tag.”, *Examiner interprets the latitude of the RFID tag to be the first navigational information) from the database that corresponds to the detected unique numerical identifications of the tracking tag (see at least para. [0023] of Xu which discloses “The databases 106, 108, 110 may store information … The map data 102 is used in a matching process to identify which road segment the vehicle is driving. The map matching process associates the vehicle's GPS position (latitude, longitude, altitude), and optionally the heading and speed information, to a road segment“, *Examiner interprets the latitude, longitude and altitude to be numerical identification of the tracking tag) that can be embedded at a certain location of the road; determine a navigational action based on the first navigational information (see at least para. 
[0027] of Xu which discloses “At step 310, scanning is performed for street light RFID tags, and while the vehicle is driving, the street light position database is queried using the identifier of the RFID tag as the searching index. At step 312, a hybrid location estimation is performed to calculate the location of the autonomous vehicle. The hybrid location estimation can take account all of the information from steps 302, 308 and 310 … The autonomous vehicle control center can use the reported information to provide a variety of services to manage autonomous vehicles”, *Examiner interprets the variety of services for the management of the autonomous vehicles to include determination of navigational action that is based on the first navigational information such as latitude since step 308 of Xu includes latitude and as discussed above, latitude is considered to be the first navigational information); operate the autonomous vehicle according to the navigational action (see at least para. [0004] of Xu which discloses “autonomous vehicle control centers can keep track of each autonomous vehicle's position and potentially maintain the remote navigation control of each autonomous vehicle”, *Examiner interprets this as operating the autonomous vehicle according to a navigational action) and cause the database to be updated (see at least para. [0004] of Xu which discloses “navigational aids can help an autonomous vehicle identify its location in line of sight situations which allows its position to be updated on a map using map matching techniques”) by mapping data (see at least para. [0023] of Xu which discloses “The map matching process associates the vehicle's GPS position (latitude, longitude, altitude), and optionally the heading and speed information, to a road segment” and see at least para.
[0024] of Xu which discloses “Each RFID tag is associated with certain information”, *Examiner interprets this associated feature as evidence that the information is mapped to unique identifications that may be numerical since they reference latitude and longitude).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the one or more processors of Yoon, as modified by Bogatine, to query a database using the unique numerical identification of the tracking tag, the database comprising navigational information mapped to different unique numerical identifications of tracking tags; identify first navigational information from the database that corresponds to the detected unique numerical identification of the tracking tag embedded underneath the surface of the road; determine a navigational action based on the first navigational information; and operate the autonomous vehicle according to the navigational action and cause the database to be updated by mapping data, as taught in Xu with a reasonable expectation of success in order to reduce inaccuracies in navigational information to avoid a potential impairment to the decision process of an autonomous vehicle so that the vehicle can operate efficiently and without disruptions. See para. [0027] of Xu for motivation.

Yoon et al., as modified by Bogatine and Xu, may not explicitly disclose one or more processors configured to … detect, via the sensor, one or more changes to one or more subsequent tracking tag; associated with the detected one or more changes to the unique numerical identifications corresponding to the subsequent tracking tags. However, Fraser discloses processors configured to … detect, via the sensor, one or more changes to one or more subsequent tracking tag (see first paragraph of the summary section on pg.
1 of Fraser which discloses “a detectable change in a characteristic of the tag detected by the sensor”); and have the detected one or more changes to the unique numerical identifications corresponding to the subsequent tracking tags (see at least pg. 9 ln. 21-23 of Fraser which discloses “the tag may simply transmit a unique, or unique within a particular set of tags, identifier (ID)”).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the one or more processors of Yoon, as modified by Bogatine and Xu, to detect, via the sensor, one or more changes to one or more subsequent tracking tag; and have the detected one or more changes to the unique numerical identifications corresponding to the subsequent tracking tags, as taught in Fraser with a reasonable expectation of success in order to improve the roadway system for better navigation by autonomous vehicles.

Regarding claim 2, Yoon, as modified by Bogatine, Xu and Fraser, discloses wherein the first navigational information (see at least para. [0009] of Yoon which discloses “The autonomous vehicles can use these tags to safely navigate roads”, *Examiner interprets use of the tags to identify navigational information to navigate the roads. And see at least para. [0014] of Yoon which discloses “with the very precise positioning system, an additional measure of redundancy/security can be incorporated for surrounding vehicles during operations such as lane changing or approaching traffic stops”, *Examiner interprets the positioning system will identify first navigational information such as lane changing (see para. [0006] of Applicant’s own specification which describes the navigational information may indicate a velocity to move at, a direction to go, a lane to be in, or another type of navigational command for the autonomous vehicle to perform)) indicates the navigational action (see at least para.
[0013] which discloses “Due to the RF frequency used in RFID systems, road hazards such as water or snow on the roadway do not interfere with the operation, allowing autonomous vehicles to navigate roads even when the optical cameras cannot find the roadway”, *Examiner interprets this ability to navigate roads to be the indication of the navigational action based on the navigational information obtained).

Regarding claim 6, Yoon, as modified by Bogatine, Xu and Fraser, discloses wherein the sensor (Fig. 3, 315 and see at least para. [0002] of Yoon which discloses “autonomous vehicles use a combination of sensors such as optical and infrared cameras, LIDAR, and ultrasonic sensors”) is a radio frequency identification (RFID) reader (see at least para. [0015] which discloses “vehicles 206 equipped with a RFID reader”, *Examiner interprets the RFID reader to be the claimed sensor since para. [0040] of Applicant’s specification describes a sensor (e.g., a non-visible camera, a receiver, a radio frequency identification (RFID) reader) of the perception module 202 monitoring the surface) and the tracking tag is an RFID tag (Fig. 1, 106 and see at least para. [0013] which describes “the RFID tag 106 can be used by the vehicle (e.g., an autonomous vehicle) to know if it is correctly positioned in the lane”).

Regarding amended claim 7, Yoon, as modified by Bogatine, Xu and Fraser, discloses all the limitations of claim 1 which includes wherein the one or more processors query the database, as discussed above. Yoon and Xu disclose further transmitting, to a remote database across a network, a first message comprising the unique numerical identification (see at least para. [0014] of Yoon which discloses “many unique addresses can be incorporated into the RFID tag 106”, *Examiner interprets the unique addresses to be unique numerical identification and see at least para. [0024] of Xu which discloses “Each RFID tag is associated with certain information.
The information is notated as {ID, X, Y, Z} in FIG. 2A, where ID is a unique identifier of a specific street light, X is the latitude of the street light. Y is the longitude of the street light, and Z is the height street light. It is noted that the latitude, longitude, and height could alternatively correspond to the latitude, longitude, and height of the RFID tag itself. When the RFID tag of a certain street light is scanned, the unique identifier, ID, is used to access a database”, *Examiner interprets the latitude, longitude and height to be numerical identification/data since they can all be expressed as decimal numbers or coordinates, i.e., numerical identification); and receiving, from the remote database across the network, a second message comprising the first navigational information (see at least para. [0023] of Yoon which discloses “the local interface 309 may be an appropriate network that facilitates communication between any two of the multiple processors 303, between any processor 303 and any of the memories 306, or between any two of the memories 306, etc.” and see at least para. [0004] of Xu which discloses “autonomous vehicle control centers can keep track of each autonomous vehicle's position and potentially maintain the remote navigation control of each autonomous vehicle”, *Examiner interprets the communication to include first and second messages).

Regarding amended claim 8, Yoon, as modified by Bogatine, Xu and Fraser, discloses all the limitations of claim 1 which includes wherein the one or more processors query the database by: sending, to a local database of the autonomous vehicle, a first message comprising the unique numerical information (see at least para. [0014] of Yoon which discloses “many unique addresses can be incorporated into the RFID tag 106”, *Examiner interprets the unique addresses to be unique numerical identification and see at least para.
[0024] of Xu which discloses “Each RFID tag is associated with certain information. The information is notated as {ID, X, Y, Z} in FIG. 2A, where ID is a unique identifier of a specific street light, X is the latitude of the street light. Y is the longitude of the street light, and Z is the height street light. It is noted that the latitude, longitude, and height could alternatively correspond to the latitude, longitude, and height of the RFID tag itself. When the RFID tag of a certain street light is scanned, the unique identifier, ID, is used to access a database”, *Examiner interprets the latitude, longitude and height to be numerical identification/data since they can all be expressed as decimal numbers or coordinates, i.e., numerical identification), as discussed above. Yoon and Xu disclose further obtaining, from the local database (see at least para. [0017] of Yoon which discloses “Stored in the memory 306 are both data and several components that are executable by the processor 303”, *Examiner interprets the memory to house the local database), a second message comprising the first navigational identification (see at least para. [0023] of Yoon which discloses “the local interface 309 may be an appropriate network that facilitates communication between any two of the multiple processors 303, between any processor 303 and any of the memories 306, or between any two of the memories 306, etc.” and see at least para. [0004] of Xu which discloses “autonomous vehicle control centers can keep track of each autonomous vehicle's position and potentially maintain the remote navigation control of each autonomous vehicle”, *Examiner interprets the communication to include first and second messages). 
Regarding claim 9, Yoon, as modified by Bogatine, Xu and Fraser, further discloses wherein the one or more processors are configured to: receive, from a remote computer across a network, an update associated with the local database; and update the local database based on the received update (see at least para. [0015] of Yoon which discloses “a system to process the information from vehicles 206 equipped with a RFID reader to create accurate and constantly updating traffic maps of road conditions/hazards, accidents, traffic levels, etc.”).

Regarding claim 10, Yoon, as modified by Bogatine, Xu and Fraser, discloses wherein the unique numerical identification corresponds to a location (see at least para. [0018] of Yoon which discloses “information that can be used to determine location information“) and the location corresponds to the first navigational information (see at least para. [0019] which discloses “The information can be used by the processing system 300 to determine, e.g., location”, *Examiner interprets the processing system will process the location that corresponds to the navigational information).

Regarding claim 11, Yoon, as modified by Bogatine, Xu and Fraser, discloses wherein the one or more processors (Fig. 3, 303 of Yoon and see at least para. [0016] of Yoon which discloses “a processing system 300 included in a vehicle 206 … The processing system 300 includes at least one processor circuit, for example, having a processor 303”, *Examiner interprets processing system 300 to include processors) detect the numerical identification (see at least para. [0018] of Yoon which discloses “the RFID tags 106 can be active tags that can respond with sensor information in addition to identifying information. The RFID reader 315 receives the data from the GatorEye unit 100 and provides it to the processing circuitry for evaluation” and see at least para.
[0030] of Yoon which discloses “ratios, concentrations, amounts, and other numerical data may be expressed herein in a range format”, *Examiner interprets this numerical data as a numerical identification) one or more changes (see first paragraph of the summary section on pg. 1 of Fraser which discloses “a detectable change in a characteristic of the tag detected by the sensor”) by: detecting (see at least para. [0027] of Yoon which discloses “It can be relatively easy to detect abnormalities without causing unsafe operation of the vehicle 206 and replacement of the device would be simple. Detection of the magnetic paint would similarly be easy”, *Examiner interprets this as an example of how potential changes in the tracking tags located near magnetic paint can be detected), via the sensor (Fig. 3, 315 and see at least para. [0002] of Yoon which discloses “autonomous vehicles use a combination of sensors such as optical and infrared cameras, LIDAR, and ultrasonic sensors“ and see at least para. [0015] which discloses “vehicles 206 equipped with a RFID reader”, *Examiner interprets the RFID reader to be the claimed sensor since para. [0040] of Applicant’s specification describes a sensor (e.g., a non-visible camera, a receiver, a radio frequency identification (RFID) reader) of the perception module 202 monitoring the surface. Additionally, Applicant’s specification describes how the vehicle 200 includes the perception module 202 which includes the sensor (RFID reader) – see para. [0023] of Applicant’s specification), a second tracking tag of the one or more subsequent tracking tags (Fig. 1, second instance of 106 in Yoon and see para. [0012] of Yoon which discloses “integrated RFID tags 106”, *Examiner notes that tags are plural, thus indicating more than one); and adjusting operation of the autonomous vehicle according to a second navigational action (see at least para.
[0009] which discloses “performing maneuvers such as changing lanes or slowing at a stop light”, *Examiner interprets these maneuvers to include a second navigational action since they are plural) based on a response of the second tracking tag to a signal transmitted to the second tracking tag by the sensor (see at least para. [0013] of Yoon which discloses “a vehicle 206 as it moves down the roadway, the response 209 from the RFID tag 106 can be used by the vehicle (e.g., an autonomous vehicle) to know if it is correctly positioned in the lane, but the response 209 can also be used to identify which lane the vehicle 206 is in”).

Regarding claim 12, Yoon discloses A computer-implemented method (see at least para. [0025] of Yoon which discloses “computer-readable medium for use by or in connection with an instruction execution system such as, for example, a processor 303 in a computer system”), comprising: monitoring, by one or more processors (Fig. 3, 303 and see at least para. [0016] of Yoon which discloses “a processing system 300 included in a vehicle 206 … The processing system 300 includes at least one processor circuit, for example, having a processor 303”, *Examiner interprets processing system 300 to include processors) via a sensor (Fig. 3, 315 and see at least para. [0002] of Yoon which discloses “autonomous vehicles use a combination of sensors such as optical and infrared cameras, LIDAR, and ultrasonic sensors“ and see at least para. [0015] which discloses “vehicles 206 equipped with a RFID reader”, *Examiner interprets the RFID reader to be the claimed sensor since para. [0040] of Applicant’s specification describes a sensor (e.g., a non-visible camera, a receiver, a radio frequency identification (RFID) reader) of the perception module 202 monitoring the surface. Additionally, Applicant’s specification describes how the vehicle 200 includes the perception module 202 which includes the sensor (RFID reader) – see para.
[0023] of Applicant’s specification), a surface of a road while an autonomous vehicle is driving on the road (see at least para. [0011] of Yoon which discloses “Sensor based methods used by autonomous vehicles … while driving“ and see at least para. [0026] which discloses “readers and/or sensors can monitor a magnetic paint that can be applied when painting lines on roads”, *Examiner interprets this as the surface of a road is monitored using a sensor); detecting, by the one or more processors, a unique numerical identification (see at least para. [0014] of Yoon which discloses “many unique addresses can be incorporated into the RFID tag 106”, *Examiner interprets the unique addresses to be unique numerical identification and see at least para. [0018] of Yoon which discloses “the RFID tags 106 can be active tags that can respond with sensor information in addition to identifying information. The RFID reader 315 receives the data from the GatorEye unit 100 and provides it to the processing circuitry for evaluation” and see at least para. [0030] which discloses “ratios, concentrations, amounts, and other numerical data may be expressed herein in a range format”, *Examiner interprets this numerical data …

Prosecution Timeline

Jul 07, 2023 · Application Filed
Mar 31, 2025 · Non-Final Rejection — §103, §112
Jun 18, 2025 · Interview Requested
Jun 26, 2025 · Examiner Interview Summary
Jun 26, 2025 · Applicant Interview (Telephonic)
Jul 02, 2025 · Response Filed
Sep 29, 2025 · Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12582033
SYSTEMS AND METHODS FOR AUTOMATED GRAIN CART UNLOADING
2y 5m to grant · Granted Mar 24, 2026
Patent 12384422
AUTONOMOUS DRIVING CONTROL APPARATUS AND METHOD THEREOF
2y 5m to grant · Granted Aug 12, 2025
Patent 12365385
VEHICLE DRIFT CONTROL METHOD AND APPARATUS, VEHICLE, STORAGE MEDIUM AND CHIP
2y 5m to grant · Granted Jul 22, 2025
Patent 12344308
A VEHICLE STRUCTURE
2y 5m to grant · Granted Jul 01, 2025
Patent 12344323
VEHICLE AERODYNAMIC IMPROVEMENT APPARATUS AND SYSTEM
2y 5m to grant · Granted Jul 01, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 90%
With Interview: 97% (+7.3%)
Median Time to Grant: 2y 2m
PTA Risk: Moderate
Based on 762 resolved cases by this examiner. Grant probability derived from career allow rate.
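For readers checking the dashboard's arithmetic, one plausible way an interview lift like the "+7.3%" above could be derived (an assumed methodology; the tool's actual formula is not stated on this page) is the relative change between the with-interview and without-interview allow rates:

```python
def interview_lift(allow_with: float, allow_without: float) -> float:
    """Relative lift of the with-interview allow rate over the baseline."""
    return allow_with / allow_without - 1.0

# With the page's rounded figures (97% with interview vs. 90% baseline)
# this gives about +7.8%; the displayed +7.3% suggests the underlying,
# unrounded rates are closer to 96.6% and 90%.
print(f"{interview_lift(0.97, 0.90):+.1%}")
```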
