Prosecution Insights
Last updated: April 19, 2026
Application No. 18/108,168

INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING SYSTEM FOR CONTROLLING OPERATION OF AUTONOMOUS VEHICLES BASED ON VISIBILITY OF LANES

Final Rejection (§103)
Filed: Feb 10, 2023
Examiner: MATTA, ALEXANDER GEORGE
Art Unit: 3668
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Toyota Jidosha Kabushiki Kaisha
OA Round: 4 (Final)
Grant Probability: 72% (Favorable)
OA Rounds: 5-6
To Grant: 3y 0m
With Interview: 94%

Examiner Intelligence

Career Allow Rate: 72% (98 granted / 137 resolved; +19.5% vs TC avg, above average)
Interview Lift: +22.6% (resolved cases with interview)
Avg Prosecution: 3y 0m (typical timeline; 42 currently pending)
Total Applications: 179 (career history, across all art units)
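The headline figures follow directly from the raw counts above. As a quick check, here is the assumed arithmetic in Python; the additive interview adjustment is an assumption about the tool's unpublished methodology, not a documented formula:

```python
# Reconstructing the dashboard's headline figures from the raw counts.
# Assumption: "with interview" probability = career allow rate + interview
# lift. The tool's actual model is not disclosed; treat this as a sketch.

granted, resolved = 98, 137
career_allow_rate = granted / resolved      # 0.715..., displayed as 72%
interview_lift = 0.226                      # +22.6% for cases with interview

print(f"Career allow rate: {career_allow_rate:.1%}")                    # 71.5%
print(f"With interview:    {career_allow_rate + interview_lift:.1%}")   # 94.1%, displayed as 94%
```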

Statute-Specific Performance

§101: 8.5% (-31.5% vs TC avg)
§103: 54.2% (+14.2% vs TC avg)
§102: 13.0% (-27.0% vs TC avg)
§112: 21.7% (-18.3% vs TC avg)
Tech Center average is an estimate • Based on career data from 137 resolved cases
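One detail worth noting when reading these deltas: subtracting each delta from the examiner's rate recovers the same Tech Center baseline for every statute, which suggests the "estimate" is a single flat figure rather than a per-statute average. A minimal check:

```python
# Recover the implied Tech Center baseline from each statute's delta.
# examiner_rate - delta = TC average; every row yields the same 40.0.

rows = {"§101": (8.5, -31.5), "§103": (54.2, 14.2),
        "§102": (13.0, -27.0), "§112": (21.7, -18.3)}
for statute, (rate, delta) in rows.items():
    print(statute, round(rate - delta, 1))  # prints 40.0 for each statute
```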

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This Office Action is in response to Applicant Amendment and Arguments filed on 7/8/2025. This Action is made FINAL. Claims 6-7 and 16-17 have been canceled. Claims 1, 8-11, and 18-20 are pending for examination.

Response to Arguments

Applicant's arguments with respect to the previous rejection of claims 1, 6-11, and 16-20 under 35 U.S.C. 103 have been considered but are not persuasive. Applicant argues: “However, amended independent claims 1 and 11 recite "when the rate of change in the visibility of the lane lines in the specific place in relation to a lapse of time is equal to or higher than a predetermined value, excluding the first data collected before a specific point of time in determining the first area". That is, the first data collected before a specific point of time is only excluded under certain circumstances, namely when the rate of change in the visibility of the lane lines in the specific place in relation to a lapse of time is equal to or higher than a predetermined value. This is in sharp contrast to Vidyakina, in which older trajectory data is excluded categorically, and not just in the particular circumstances recited in the claims. Therefore, Applicant respectfully submits that the cited references fails to teach or suggest the features of amended independent claims 1 and 11.”

Examiner disagrees. Vidyakina does not teach that all “old” data should be categorically excluded; rather, old data is excluded when the data stream is highly dynamic (high rate of change). Vidyakina discloses two scenarios: older data should not be excluded for sign recognition and mapping, because those are not very dynamic in nature, but for a traffic service, which is more dynamic in nature, older data should be excluded.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 8-11, and 18-20 are rejected under 35 U.S.C. 103 as being unpatentable over Bai et al. (US 20200262438 A1, hereinafter known as Bai) in view of Zhao et al. (US 20220348224 A1, hereinafter known as Zhao), Liu (US 20200117916 A1), He (US 20210312195 A1), and Vidyakina et al. (US 20220394425 A1, hereinafter known as Vidyakina).
Bai, Zhao, Liu, He, and Vidyakina were cited in a previous office action.

Regarding Claim 1, Bai teaches An information processing apparatus comprising a controller including at least one processor configured to execute the processing of: {Fig.2 and Para [0037] “The components of the dynamic roadway sensing system 100 of FIG. 1 can be configured for computer communication. An exemplary configuration is shown in FIG. 2 within an operating environment 200. The operating environment 200 includes a connected vehicle 202 (e.g., the connected devices 106), other connected vehicles 204 (e.g., the connected devices 106), a road condition monitoring system server 206 (e.g., the road condition monitoring system 102), and a roadway manager system server 208 (e.g., the roadway manager system 104). Although not shown, other servers, connected devices, data stores and systems (e.g., other road condition data sources 110), and other vehicles (e.g., roadway manager vehicles 112) can include some or all of the computer components shown with respect to the components in FIG. 2. Thus, it is understood that the components of the operating environment 200, the connected vehicle 202, the other connected vehicles 204, the road condition monitoring system server 206, the roadway manager system server 208, and the as well as the components of other systems, hardware architectures, and software architectures discussed herein, can be combined, omitted, or organized into different architectures for various embodiments.” } collecting from a plurality of first vehicles first data relating to conditions of lane lines located in a neighborhood of the first vehicles; {Para [0034] “The road condition monitoring system 102 acquires data from connected devices 106. As used herein, the connected devices 106 can include vehicles, users, and/or infrastructure operatively connected for computer communication with the road condition monitoring system 102. For example, the devices 106 can include vehicles and users in an OEM network that are operably connected for computer communication as defined herein. The connected devices 106 can also include roadside devices, traffic infrastructures, and portable devices (e.g., associated with a vehicle occupant, pedestrian, other road users), among others, that are operably connected for computer communication as defined herein. As will be discussed, roadway data can be captured by the connected devices 106 using image sensors, vehicle sensors, among other types of sensors.” Para [0068] “As an illustrative example, the roadway data can indicate that a single yellow line is present on a roadway while stored map data can indicate that a double yellow line should be present on said roadway. This discrepancy is identified by the road condition monitoring system and used to classify the lane line degradation, for example, by a measurement of visibility of the lane line. The measurement of visibility can be due to weather conditions and/or roadway conditions. For example, current weather can affect visibility of the lane lines, obstructing road debris can affect visibility of the lane lines, and road degradation can affect visibility of the lane lines.
In another embodiment, a roadway can be classified according to a road type of new or old based on the visibility of the lane lines on that roadway.” } determining a first area in which visibility of the lane lines is equal to or lower than a predetermined value based on the first data; { Para [0067] “According to another example, the road condition monitoring system 102 can detect, monitor, and control maintenance of lane lines. Currently, some roadway managers are manually inspecting roadways for lane line degradation and/or relying on road users to report traffic flow issues that can be caused by lane line degradation. Using roadway data, the road condition monitoring system can monitor lane lines, classify the lane lines, the severity of degradation and/or the level of visibility, and report the classifications to a roadway manager and/or control maintenance of lane lines, road maintenance, and/or roadway design. For example, image data of the lane lines (e.g., roadway data) can be compared to baseline data about the roadway and/or lane lines. Thus, the road condition monitoring system 102 can determine whether there is a discrepancy between the current image data of the lane lines and the baseline data (e.g., what is there vs. what should be there with respect to lane lines).” } and executing specific processing for control of an autonomous vehicle when in the first area. {Para [0078] “In another embodiment, autonomous vehicle control of connected vehicles on the roadway can be executed based on the roadway data and the road condition monitoring system. Settings of an autonomous vehicle and/or an ADAS can be controlled. For example, the system can adjust driver hand-over time with earlier warnings to upcoming driving environment changes, such as a hand-over time for a change lane detection to be more or less sensitive. As an illustrative example, a change lane detection contrast setting of a lane detection system can be modified based on a level of visibility of lane markings. Typically, if blurry lane lines are detected, the lane detection system may turn off detection output. However, the system does not have to turn off lane detection if the connected vehicle determines an upcoming road segment has blurry lane markings (e.g., 75% visibility), which increases the lane detection system confidence. It is understood that similar methods applied to autonomous vehicle control and ADAS can be applied, for example, with signage condition monitoring and other road condition monitoring.” } wherein the first data includes location information of the first vehicle and a vehicle-view moving image. {Para [0047] “Referring now to FIG. 7 a method 700 for condition monitoring is shown according to one embodiment. At block 702, the method 700 includes capturing roadway data. For example, the roadway data can be captured by the connected vehicle 202 using the vehicle systems 218 and/or the sensors 220. In other embodiments, roadway data can be captured by other connected vehicles 204, connected nfrastructures/devices, and/or accessed from the other road condition data sources 110.” Para [0043] “The sensors 220, which can be implemented with the vehicle systems 218, can include various types of sensors for use with the vehicle 202 and/or the vehicle systems 218 for detecting and/or sensing a parameter of the vehicle 202, the vehicle systems 218, and/or the environment surrounding the vehicle 202. For example, the sensors 220 can capture and provide roadway data as discussed herein. 
The sensors 220 can include, but are not limited to: acceleration sensors, speed sensors, braking sensors, proximity sensors, vision sensors, ranging sensors, seat sensors, seat-belt sensors, door sensors, environmental sensors, yaw rate sensors, steering sensors, GPS sensors, among others. It is also understood that the sensors 220 can be any type of sensor, for example, acoustic, electric, environmental, optical, imaging, light, pressure, force, thermal, temperature, proximity, among others. In the embodiments discussed herein, the sensors 220 can be used to capture roadway data.” Para [0051] “At block 708, the method 700 includes transmitting the data to the roadway manager system 104. The data can include the roadway data, the condition, and/or the priority level along with other data related to the location, condition, priority level, etc. In some embodiments, the data includes data aggregated and/or analyzed by the road condition monitoring system 102.” Para [0042-0043] discuss how the image sensor used is the one for an autonomous driving system and therefore the images are designed to be captured while the vehicle is in motion. Additionally video capture is the common data captured by autonomous vehicles for autonomous driving. } use the vehicle-view moving image {para [0067] “Using roadway data, the road condition monitoring system can monitor lane lines, classify the lane lines, the severity of degradation and/or the level of visibility, and report the classifications to a roadway manager and/or control maintenance of lane lines, road maintenance, and/or roadway design. For example, image data of the lane lines (e.g., roadway data) can be compared to baseline data about the roadway and/or lane lines. Thus, the road condition monitoring system 102 can determine whether there is a discrepancy between the current image data of the lane lines and the baseline data (e.g., what is there vs. what should be there with respect to lane lines).” } determining a degree of agreement between the image-recognized lane lines and lane lines defined in a database at intervals {para [0067] “Using roadway data, the road condition monitoring system can monitor lane lines, classify the lane lines, the severity of degradation and/or the level of visibility, and report the classifications to a roadway manager and/or control maintenance of lane lines, road maintenance, and/or roadway design. For example, image data of the lane lines (e.g., roadway data) can be compared to baseline data about the roadway and/or lane lines. Thus, the road condition monitoring system 102 can determine whether there is a discrepancy between the current image data of the lane lines and the baseline data (e.g., what is there vs. what should be there with respect to lane lines).” Para [0068] “As an illustrative example, the roadway data can indicate that a single yellow line is present on a roadway while stored map data can indicate that a double yellow line should be present on said roadway. This discrepancy is identified by the road condition monitoring system and used to classify the lane line degradation, for example, by a measurement of visibility of the lane line. The measurement of visibility can be due to weather conditions and/or roadway conditions. For example, current weather can affect visibility of the lane lines, obstructing road debris can affect visibility of the lane lines, and road degradation can affect visibility of the lane lines. 
In another embodiment, a roadway can be classified according to a road type of new or old based on the visibility of the lane lines on that roadway.” Para [0073] “As mentioned above, the roadway data, classifications and/or priority levels can be communicated to the roadway manager system server 208 and/or this information can be automatically updated. The communication and/or update timing can be on-demand, periodic, contingent or particular criteria, and/or any combination of the aforementioned. For example, if the roadway manager system 104 is interested in a particular road segment due to weather conditions and/or roadway conditions, the road condition monitoring system 102 can receive, determine and transmit information about the road segment on-demand. In another embodiment, the information can be transmitted at a predetermined time period, for example, once a week, once a month.” Where on demand can be considered a real-time interval but also non real time intervals are contemplated though not explicitly for the determination step. } wherein the controller determines the visibility of the lane lines by comparing the locations of the lane lines detected and the locations of lane lines defined in a database. {para [0067] “Using roadway data, the road condition monitoring system can monitor lane lines, classify the lane lines, the severity of degradation and/or the level of visibility, and report the classifications to a roadway manager and/or control maintenance of lane lines, road maintenance, and/or roadway design. For example, image data of the lane lines (e.g., roadway data) can be compared to baseline data about the roadway and/or lane lines. Thus, the road condition monitoring system 102 can determine whether there is a discrepancy between the current image data of the lane lines and the baseline data (e.g., what is there vs. what should be there with respect to lane lines).” Para [0068] “As an illustrative example, the roadway data can indicate that a single yellow line is present on a roadway while stored map data can indicate that a double yellow line should be present on said roadway. This discrepancy is identified by the road condition monitoring system and used to classify the lane line degradation, for example, by a measurement of visibility of the lane line. The measurement of visibility can be due to weather conditions and/or roadway conditions. For example, current weather can affect visibility of the lane lines, obstructing road debris can affect visibility of the lane lines, and road degradation can affect visibility of the lane lines. In another embodiment, a roadway can be classified according to a road type of new or old based on the visibility of the lane lines on that roadway.” } determining a change of the visibility of the lane lines in a specific place with time on the basis of the first data collected at a plurality of points of time; {Para [0068] “As an illustrative example, the roadway data can indicate that a single yellow line is present on a roadway while stored map data can indicate that a double yellow line should be present on said roadway. This discrepancy is identified by the road condition monitoring system and used to classify the lane line degradation, for example, by a measurement of visibility of the lane line. The measurement of visibility can be due to weather conditions and/or roadway conditions. 
For example, current weather can affect visibility of the lane lines, obstructing road debris can affect visibility of the lane lines, and road degradation can affect visibility of the lane lines. In another embodiment, a roadway can be classified according to a road type of new or old based on the visibility of the lane lines on that roadway.” Para [0073] “As mentioned above, the roadway data, classifications and/or priority levels can be communicated to the roadway manager system server 208 and/or this information can be automatically updated. The communication and/or update timing can be on-demand, periodic, contingent or particular criteria, and/or any combination of the aforementioned. For example, if the roadway manager system 104 is interested in a particular road segment due to weather conditions and/or roadway conditions, the road condition monitoring system 102 can receive, determine and transmit information about the road segment on-demand. In another embodiment, the information can be transmitted at a predetermined time period, for example, once a week, once a month.” } when the rate of change in the visibility of the lane lines in the specific place in relation to a lapse of time is equal to or higher than a predetermined value, the controller {Para [0074] “In other embodiments, the information can be transmitted based on the roadway data and/or other detected conditions and levels. For example, based on a particular weather condition, roadway condition, visibility condition, and other predefined trigger conditions. As an illustrative example, the road condition monitoring system 102 can transmit information about a road segment upon detecting a severe pothole, a medium snowfall, and low visibility. In another embodiment, the road condition monitoring system 102 can monitor for sudden relative change based on time, weather conditions, roadway conditions, and debris as compared to absolute change over the time. As an illustrative example, pavement degradation over two years. This allows roadway managers to manage roadways proactively.” Para [0066] “However, based on roadway data indicating a sudden change in the roadway (e.g., a large pothole) or a severe weather event, the repair classification can be modified to short term.” As discussed previously, roadway conditions can be related to lane line visibility, and the controller is looking for sudden changes in conditions rather than longer-term changes in conditions. }

Bai does not explicitly teach: executing specific processing for preventing entry of an autonomous vehicle into the first area; performing image segmentation of the vehicle-view moving image using a machine learning model to image recognize lane lines; at intervals of a predetermined number of frames of the vehicle-view moving image; and the controller excludes the first data collected before a specific point of time. However, Zhao teaches executing specific processing for preventing entry of an autonomous vehicle into the first area. {Para [0057-0058] “A broad command 130a may be related to a particular unexpected road condition 156 that applies to one or more AVs 802 generally. The broad command 130a may be directed to one or more AVs 802, for example, on a particular road 102 that are headed toward the particular unexpected road condition 156. As illustrated in FIG. 1, the AVs 802 are traveling along the road 102, where the AV 802a is ahead of the AV 802b, and the AVs 802 may encounter different unexpected road conditions 156.
For each unexpected road condition 156, the operation server 120 may issue a particular command 130 (i.e., an operation server-to-AV command 132), as described below. [0058] For example, the operation server 120 may communicate a broad command 130a to one or more AVs 802 indicating that there is a road closure 104 on the road 102, and to find another routing plan 144 (e.g., a routing plan 144 that provides the safest driving experience) to reach the destination. In another example, the operation server 120 may communicate a broad command 130a to one or more AVs 802 indicating that severe weather is detected (e.g., based on the weather data 152) on the road 102 ahead of the lead AV 802a, and to find a next exit (e.g., exit 112), and pull over. In another example, the operation server 120 may communicate a broad command 130a to one or more AVs 802 indicating that an unexpected object 106 (that is not detected in the map data 142) is detected on the road 102, and to navigate around the unexpected object 106. In another example, the operation server 120 may communicate a broad command 130a to one or more AVs 802 indicating that an unexpected construction zone 108 (that is not detected in the map data 142) is detected on the road 102, and find a spot on a side of the road 102 to pull over.” } performing object recognition of the vehicle-view moving image using a machine learning model to image recognize lane lines; {Para [0047] “Control device 850 is generally configured to control the operation of the AV 802 and its components. The control device 850 is further configured to determine a pathway in front of the AV 802 that is safe to travel and free of objects/obstacles, and navigate the AV 802 to travel in that pathway. This process is described in more detail in FIGS. 8-10. The control device 850 generally includes one or more computing devices in signal communication with other components of the AV 802 (see FIG. 8). The control device 850 receives sensor data 148 from one or more sensors 846 positioned on the AV 802 to determine a safe pathway to travel. The sensor data 148 includes data captured by the sensors 846. Sensors 846 are configured to capture any object within their detection zones or fields of view, such as landmarks, lane markings, lane boundaries, road boundaries, vehicles, pedestrians, road/traffic signs, among others. The sensors 846 may include cameras, LiDAR sensors, motion sensors, infrared sensors, and the like. In one embodiment, the sensors 846 may be positioned around the AV 802 to capture the environment surrounding the AV 802.” Para [0073] “In one embodiment, the object detection machine learning modules 140 may be implemented using machine learning algorithms, such as Support Vector Machine (SVM), Naive Bayes, Logistic Regression, k-Nearest Neighbors, Decision Trees, or the like. In one embodiment, the object detection machine learning modules 140 may utilize a plurality of neural network layers, convolutional neural network layers, and/or the like, in which weights and biases of these layers are optimized in the training process of the object detection machine learning modules 140. The object detection machine learning modules 140 may be trained by a training dataset which includes samples of data types labeled with one or more objects in each sample. For example, the training dataset may include sample images of objects (e.g., vehicles, lane markings, pedestrian, road signs, obstacles 118, etc.) labeled with object(s) in each sample image. 
Similarly, the training dataset may include samples of other data types, such as videos, infrared images, point clouds, Radar data, etc. labeled with object(s) in each sample data. The object detection machine learning modules 140 may be trained, tested, and refined by the training dataset and the sensor data 148. The object detection machine learning modules 140 use the sensor data 148 (which are not labeled with objects) to increase their accuracy of predictions in detecting objects. For example, supervised and/or unsupervised machine learning algorithms may be used to validate the predictions of the object detection machine learning modules 140 in detecting objects in the sensor data 148.” } It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Bai to incorporate the teachings of Zhao to prevent entry into an area with an unexpected road condition (e.g. reduced lane line visibility) because it improves safety, as avoiding an area with bad conditions reduces the chance of an accident.

Bai in view of Zhao does not explicitly teach: performing image segmentation of the vehicle-view moving image using a machine learning model to image recognize lane lines; at intervals of a predetermined number of frames of the vehicle-view moving image; and the controller excludes the first data collected before a specific point of time. However, Liu teaches performing image segmentation of the vehicle-view moving image using a machine learning model to image recognize lane lines; {para [0019] “Embodiments of the disclosures disclose a deep learning image segmentation system for an ADV to detect whole lane lines. According to one aspect, a system receives a captured image perceiving one or more lane markers surrounding the ADV from an image capturing device of the ADV. The system detects one or more continuous lane lines based on the one or more lane markers (e.g., painted roadway dashed lane markers, botts' dots, raised pavement markers, reflective pavement markers, division lines between new and old roads curbs, or a combination thereof) in the captured image by applying a machine learning model to the captured image, where the machine learning model includes a number of layers of nodes and the machine learning model includes a weighted softmax cross-entropy loss within at least one of the layers in training. The system generates a trajectory based on the one or more continuous (e.g., whole) lane lines to control the ADV autonomously according to the trajectory.” } It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Bai in view of Zhao to incorporate the teachings of Liu to use machine learning based image segmentation because, as discussed in Liu para [0003], “Image segmentation models can be used to detect lane lines. In computer imaging, image segmentation is a process of partitioning or grouping a digital image into segments of pixels to be more meaningful for analysis. Tradition feature-based image segmentation cannot detect a whole lane line and a length of a predicted lane may be too short further analysis. In addition, feature-based image segmentation may fail to detect some lane lines.” Para [0019] “Embodiments of the disclosures disclose a deep learning image segmentation system for an ADV to detect whole lane lines.”

Bai in view of Zhao and Liu does not explicitly teach: at intervals of a predetermined number of frames of the vehicle-view moving image; and the controller excludes the first data collected before a specific point of time. He teaches performing processing at intervals of a predetermined number of frames of the vehicle-view moving image; {para [0046] “In some embodiments, after acquiring a video stream including a lane marking, the lane marking detecting apparatus can extract individual image frames in the video stream, so as to perform lane marking detection based on the extracted image frames. Specifically, a predetermined interval of frames can be pre-configured, and images extracted based on the predetermined interval of frames are referred to as key image frames, while the other extracted images are referred to as non-key image frames.” } It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Bai in view of Zhao in view of Liu to incorporate the teachings of He to have processing performed at an interval of frames because it can improve efficiency, as processing every frame can be computationally inefficient. (He para [0004] “In the prior art, a typically used lane marking detecting method is: collecting training samples in a training process, and training a base network model based on the training samples to obtain a lane marking detecting model. In an application process, a feature map can be determined for each image frame based on the trained lane marking detecting model, and lane marking detection can be performed on the feature map based on the lane marking detecting model. However, since a feature map have to be determined for every image frame by the lane marking detecting model, low detection efficiency and high cost can be an issue. This application provides a lane marking detecting method, an apparatus, an electronic device, a storage medium, a program product, and a vehicle for improving the efficiency of lane marking detection.”)

Bai in view of Zhao, Liu, and He does not teach: the controller excludes the first data collected before a specific point of time. However, Vidyakina teaches when the rate of change of road data in the specific place in relation to the lapse of time is equal to or higher than a predetermined value, the controller excludes the first data collected before a specific point of time {Para [0069] “Each service provider has some allowed latency time in the trajectory data they receive. For example, trajectory data that is an hour old may be useful for sign recognition, map building/healing, etc., but has no value for traffic services as traffic changes in a much more dynamic manner, such every minute or two. For a traffic service the latency has to be as low as possible (e.g., <=allowed latency).
For a traffic service, the data provider must receive data in relatively short chunks (e.g., 60 seconds), anonymize the data, and provide the data in less time than the allowed latency.” } It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Bai in view of Zhao, Liu, and He to incorporate the teachings of Vidyakina to exclude old data if there are more dynamic changes (e.g. a high rate of change) because old data is not representative of current trends if the data is very dynamic (e.g. has a high rate of change), and will thus lead to inaccurate decision making.

Regarding Claim 8, Bai in view of Zhao, Liu, He, and Vidyakina teaches The information processing apparatus according to claim 1. Zhao further teaches wherein the controller executes processing of creating a travel route keeping out of the first area for the autonomous vehicle, as the specific processing. {Para [0057-0058] “A broad command 130a may be related to a particular unexpected road condition 156 that applies to one or more AVs 802 generally. The broad command 130a may be directed to one or more AVs 802, for example, on a particular road 102 that are headed toward the particular unexpected road condition 156. As illustrated in FIG. 1, the AVs 802 are traveling along the road 102, where the AV 802a is ahead of the AV 802b, and the AVs 802 may encounter different unexpected road conditions 156. For each unexpected road condition 156, the operation server 120 may issue a particular command 130 (i.e., an operation server-to-AV command 132), as described below. [0058] For example, the operation server 120 may communicate a broad command 130a to one or more AVs 802 indicating that there is a road closure 104 on the road 102, and to find another routing plan 144 (e.g., a routing plan 144 that provides the safest driving experience) to reach the destination. In another example, the operation server 120 may communicate a broad command 130a to one or more AVs 802 indicating that severe weather is detected (e.g., based on the weather data 152) on the road 102 ahead of the lead AV 802a, and to find a next exit (e.g., exit 112), and pull over. In another example, the operation server 120 may communicate a broad command 130a to one or more AVs 802 indicating that an unexpected object 106 (that is not detected in the map data 142) is detected on the road 102, and to navigate around the unexpected object 106. In another example, the operation server 120 may communicate a broad command 130a to one or more AVs 802 indicating that an unexpected construction zone 108 (that is not detected in the map data 142) is detected on the road 102, and find a spot on a side of the road 102 to pull over.” }

Regarding Claim 9, Bai in view of Zhao, Liu, He, and Vidyakina teaches The information processing apparatus according to claim 1. Zhao further teaches wherein the controller executes processing of stopping the operation of the autonomous vehicle passing the first area, as the specific processing. {Para [0057-0058] “A broad command 130a may be related to a particular unexpected road condition 156 that applies to one or more AVs 802 generally. The broad command 130a may be directed to one or more AVs 802, for example, on a particular road 102 that are headed toward the particular unexpected road condition 156. As illustrated in FIG.
1, the AVs 802 are traveling along the road 102, where the AV 802a is ahead of the AV 802b, and the AVs 802 may encounter different unexpected road conditions 156. For each unexpected road condition 156, the operation server 120 may issue a particular command 130 (i.e., an operation server-to-AV command 132), as described below. [0058] For example, the operation server 120 may communicate a broad command 130a to one or more AVs 802 indicating that there is a road closure 104 on the road 102, and to find another routing plan 144 (e.g., a routing plan 144 that provides the safest driving experience) to reach the destination. In another example, the operation server 120 may communicate a broad command 130a to one or more AVs 802 indicating that severe weather is detected (e.g., based on the weather data 152) on the road 102 ahead of the lead AV 802a, and to find a next exit (e.g., exit 112), and pull over. In another example, the operation server 120 may communicate a broad command 130a to one or more AVs 802 indicating that an unexpected object 106 (that is not detected in the map data 142) is detected on the road 102, and to navigate around the unexpected object 106. In another example, the operation server 120 may communicate a broad command 130a to one or more AVs 802 indicating that an unexpected construction zone 108 (that is not detected in the map data 142) is detected on the road 102, and find a spot on a side of the road 102 to pull over.” } Regarding Claim 10, Bai in view of Zhao, Liu, He, and Vidyakina teaches The information processing apparatus according to claim 1. Zhao further teaches wherein controller executes processing of changing a travel route of the autonomous vehicle passing the first area, as the specific processing. {Para [0057-0058] “A broad command 130a may be related to a particular unexpected road condition 156 that applies to one or more AVs 802 generally. The broad command 130a may be directed to one or more AVs 802, for example, on a particular road 102 that are headed toward the particular unexpected road condition 156. As illustrated in FIG. 1, the AVs 802 are traveling along the road 102, where the AV 802a is ahead of the AV 802b, and the AVs 802 may encounter different unexpected road conditions 156. For each unexpected road condition 156, the operation server 120 may issue a particular command 130 (i.e., an operation server-to-AV command 132), as described below. [0058] For example, the operation server 120 may communicate a broad command 130a to one or more AVs 802 indicating that there is a road closure 104 on the road 102, and to find another routing plan 144 (e.g., a routing plan 144 that provides the safest driving experience) to reach the destination. In another example, the operation server 120 may communicate a broad command 130a to one or more AVs 802 indicating that severe weather is detected (e.g., based on the weather data 152) on the road 102 ahead of the lead AV 802a, and to find a next exit (e.g., exit 112), and pull over. In another example, the operation server 120 may communicate a broad command 130a to one or more AVs 802 indicating that an unexpected object 106 (that is not detected in the map data 142) is detected on the road 102, and to navigate around the unexpected object 106. 
In another example, the operation server 120 may communicate a broad command 130a to one or more AVs 802 indicating that an unexpected construction zone 108 (that is not detected in the map data 142) is detected on the road 102, and find a spot on a side of the road 102 to pull over.” } Regarding Claim 11, Bai teaches An information processing system comprising a server apparatus and a vehicle {Fig.2 and Para [0037] “The components of the dynamic roadway sensing system 100 of FIG. 1 can be configured for computer communication. An exemplary configuration is shown in FIG. 2 within an operating environment 200. The operating environment 200 includes a connected vehicle 202 (e.g., the connected devices 106), other connected vehicles 204 (e.g., the connected devices 106), a road condition monitoring system server 206 (e.g., the road condition monitoring system 102), and a roadway manager system server 208 (e.g., the roadway manager system 104). Although not shown, other servers, connected devices, data stores and systems (e.g., other road condition data sources 110), and other vehicles (e.g., roadway manager vehicles 112) can include some or all of the computer components shown with respect to the components in FIG. 2. Thus, it is understood that the components of the operating environment 200, the connected vehicle 202, the other connected vehicles 204, the road condition monitoring system server 206, the roadway manager system server 208, and the as well as the components of other systems, hardware architectures, and software architectures discussed herein, can be combined, omitted, or organized into different architectures for various embodiments.” } wherein the vehicle comprises a first controller including at least one processor that transmits first data relating to conditions of lane lines located in a neighborhood of the vehicle to the server apparatus, {Para [0034] “The road condition monitoring system 102 acquires data from connected devices 106. As used herein, the connected devices 106 can include vehicles, users, and/or infrastructure operatively connected for computer communication with the road condition monitoring system 102. For example, the devices 106 can include vehicles and users in an OEM network that are operably connected for computer communication as defined herein. The connected devices 106 can also include roadside devices, traffic infrastructures, and portable devices (e.g., associated with a vehicle occupant, pedestrian, other road users), among others, that are operably connected for computer communication as defined herein. As will be discussed, roadway data can be captured by the connected devices 106 using image sensors, vehicle sensors, among other types of sensors.” Para [0068] “As an illustrative example, the roadway data can indicate that a single yellow line is present on a roadway while stored map data can indicate that a double yellow line should be present on said roadway. This discrepancy is identified by the road condition monitoring system and used to classify the lane line degradation, for example, by a measurement of visibility of the lane line. The measurement of visibility can be due to weather conditions and/or roadway conditions. For example, current weather can affect visibility of the lane lines, obstructing road debris can affect visibility of the lane lines, and road degradation can affect visibility of the lane lines. 
In another embodiment, a roadway can be classified according to a road type of new or old based on the visibility of the lane lines on that roadway.” } and the server apparatus comprises a second controller including at least one processor that determines a first area in which visibility of the lane lines is equal to or lower than a predetermined value based on the first data { Para [0067] “According to another example, the road condition monitoring system 102 can detect, monitor, and control maintenance of lane lines. Currently, some roadway managers are manually inspecting roadways for lane line degradation and/or relying on road users to report traffic flow issues that can be caused by lane line degradation. Using roadway data, the road condition monitoring system can monitor lane lines, classify the lane lines, the severity of degradation and/or the level of visibility, and report the classifications to a roadway manager and/or control maintenance of lane lines, road maintenance, and/or roadway design. For example, image data of the lane lines (e.g., roadway data) can be compared to baseline data about the roadway and/or lane lines. Thus, the road condition monitoring system 102 can determine whether there is a discrepancy between the current image data of the lane lines and the baseline data (e.g., what is there vs. what should be there with respect to lane lines).” } and executes specific processing for control of an autonomous vehicle when in the first area, {Para [0078] “In another embodiment, autonomous vehicle control of connected vehicles on the roadway can be executed based on the roadway data and the road condition monitoring system. Settings of an autonomous vehicle and/or an ADAS can be controlled. For example, the system can adjust driver hand-over time with earlier warnings to upcoming driving environment changes, such as a hand-over time for a change lane detection to be more or less sensitive. As an illustrative example, a change lane detection contrast setting of a lane detection system can be modified based on a level of visibility of lane markings. Typically, if blurry lane lines are detected, the lane detection system may turn off detection output. However, the system does not have to turn off lane detection if the connected vehicle determines an upcoming road segment has blurry lane markings (e.g., 75% visibility), which increases the lane detection system confidence. It is understood that similar methods applied to autonomous vehicle control and ADAS can be applied, for example, with signage condition monitoring and other road condition monitoring.” } wherein the first data includes location information of the first vehicle and a vehicle-view moving image. {Para [0047] “Referring now to FIG. 7 a method 700 for condition monitoring is shown according to one embodiment. At block 702, the method 700 includes capturing roadway data. For example, the roadway data can be captured by the connected vehicle 202 using the vehicle systems 218 and/or the sensors 220. In other embodiments, roadway data can be captured by other connected vehicles 204, connected nfrastructures/devices, and/or accessed from the other road condition data sources 110.” Para [0043] “The sensors 220, which can be implemented with the vehicle systems 218, can include various types of sensors for use with the vehicle 202 and/or the vehicle systems 218 for detecting and/or sensing a parameter of the vehicle 202, the vehicle systems 218, and/or the environment surrounding the vehicle 202. 
For example, the sensors 220 can capture and provide roadway data as discussed herein. The sensors 220 can include, but are not limited to: acceleration sensors, speed sensors, braking sensors, proximity sensors, vision sensors, ranging sensors, seat sensors, seat-belt sensors, door sensors, environmental sensors, yaw rate sensors, steering sensors, GPS sensors, among others. It is also understood that the sensors 220 can be any type of sensor, for example, acoustic, electric, environmental, optical, imaging, light, pressure, force, thermal, temperature, proximity, among others. In the embodiments discussed herein, the sensors 220 can be used to capture roadway data.” Para [0051] “At block 708, the method 700 includes transmitting the data to the roadway manager system 104. The data can include the roadway data, the condition, and/or the priority level along with other data related to the location, condition, priority level, etc. In some embodiments, the data includes data aggregated and/or analyzed by the road condition monitoring system 102.” Para [0042-0043] discuss how the image sensor used is the one for an autonomous driving system and therefore the images are designed to be captured while the vehicle is in motion. Additionally video capture is the common data captured by autonomous vehicles for autonomous driving. } use the vehicle-view moving image {para [0067] “Using roadway data, the road condition monitoring system can monitor lane lines, classify the lane lines, the severity of degradation and/or the level of visibility, and report the classifications to a roadway manager and/or control maintenance of lane lines, road maintenance, and/or roadway design. For example, image data of the lane lines (e.g., roadway data) can be compared to baseline data about the roadway and/or lane lines. Thus, the road condition monitoring system 102 can determine whether there is a discrepancy between the current image data of the lane lines and the baseline data (e.g., what is there vs. what should be there with respect to lane lines).” } determining a degree of agreement between the image-recognized lane lines and lane lines defined in a database at intervals {para [0067] “Using roadway data, the road condition monitoring system can monitor lane lines, classify the lane lines, the severity of degradation and/or the level of visibility, and report the classifications to a roadway manager and/or control maintenance of lane lines, road maintenance, and/or roadway design. For example, image data of the lane lines (e.g., roadway data) can be compared to baseline data about the roadway and/or lane lines. Thus, the road condition monitoring system 102 can determine whether there is a discrepancy between the current image data of the lane lines and the baseline data (e.g., what is there vs. what should be there with respect to lane lines).” Para [0068] “As an illustrative example, the roadway data can indicate that a single yellow line is present on a roadway while stored map data can indicate that a double yellow line should be present on said roadway. This discrepancy is identified by the road condition monitoring system and used to classify the lane line degradation, for example, by a measurement of visibility of the lane line. The measurement of visibility can be due to weather conditions and/or roadway conditions. 
For example, current weather can affect visibility of the lane lines, obstructing road debris can affect visibility of the lane lines, and road degradation can affect visibility of the lane lines. In another embodiment, a roadway can be classified according to a road type of new or old based on the visibility of the lane lines on that roadway.” Para [0073] “As mentioned above, the roadway data, classifications and/or priority levels can be communicated to the roadway manager system server 208 and/or this information can be automatically updated. The communication and/or update timing can be on-demand, periodic, contingent or particular criteria, and/or any combination of the aforementioned. For example, if the roadway manager system 104 is interested in a particular road segment due to weather conditions and/or roadway conditions, the road condition monitoring system 102 can receive, determine and transmit information about the road segment on-demand. In another embodiment, the information can be transmitted at a predetermined time period, for example, once a week, once a month.” Where on demand can be considered a real-time interval but also non real time intervals are contemplated though not explicitly for the determination step. } wherein the controller determines the visibility of the lane lines by comparing the locations of the lane lines detected and the locations of lane lines defined in a database. {para [0067] “Using roadway data, the road condition monitoring system can monitor lane lines, classify the lane lines, the severity of degradation and/or the level of visibility, and report the classifications to a roadway manager and/or control maintenance of lane lines, road maintenance, and/or roadway design. For example, image data of the lane lines (e.g., roadway data) can be compared to baseline data about the roadway and/or lane lines. Thus, the road condition monitoring system 102 can determine whether there is a discrepancy between the current image data of the lane lines and the baseline data (e.g., what is there vs. what should be there with respect to lane lines).” Para [0068] “As an illustrative example, the roadway data can indicate that a single yellow line is present on a roadway while stored map data can indicate that a double yellow line should be present on said roadway. This discrepancy is identified by the road condition monitoring system and used to classify the lane line degradation, for example, by a measurement of visibility of the lane line. The measurement of visibility can be due to weather conditions and/or roadway conditions. […]
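The limitation at the center of the dispute is conditional: older data leaves the first-area determination only when the rate of change of visibility meets a threshold, whereas a latency window discards old data unconditionally. To make that structure concrete, here is a minimal, hypothetical Python sketch of the claim language; every name, data shape, and the rate estimate is an illustrative assumption, not the applicant's implementation or code from any cited reference:

```python
# Hypothetical sketch of the disputed limitation of amended claims 1 and 11:
# first data collected before a specific point of time is excluded from the
# first-area determination ONLY when the rate of change of lane-line
# visibility at the place meets a predetermined value. All identifiers and
# the rate estimate are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Observation:
    timestamp: float    # seconds since epoch
    visibility: float   # 0.0 (lane lines invisible) .. 1.0 (fully visible)

def data_for_first_area(
    observations: list[Observation],
    rate_threshold: float,   # the claimed "predetermined value"
    cutoff_time: float,      # the claimed "specific point of time"
) -> list[Observation]:
    """Select the first data used to determine the low-visibility area."""
    obs = sorted(observations, key=lambda o: o.timestamp)
    if len(obs) < 2:
        return obs
    # Rate of change of visibility "in relation to a lapse of time",
    # estimated crudely here from the earliest and latest observations.
    dt = obs[-1].timestamp - obs[0].timestamp
    rate = abs(obs[-1].visibility - obs[0].visibility) / dt if dt > 0 else 0.0
    if rate >= rate_threshold:
        # Conditional exclusion: drop data collected before the cutoff.
        return [o for o in obs if o.timestamp >= cutoff_time]
    # Rate below threshold: all collected data stays in the determination
    # (contrast with a latency window, which drops old data regardless).
    return obs
```

The examiner reads Vidyakina's latency handling as likewise conditional on how dynamic the data stream is; the applicant reads it as the unconditional branch. That disagreement maps onto the `if rate >= rate_threshold` test above.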

Prosecution Timeline

Feb 10, 2023
Application Filed
Oct 05, 2024
Non-Final Rejection — §103
Jan 09, 2025
Response Filed
Jan 29, 2025
Final Rejection — §103
Mar 18, 2025
Examiner Interview Summary
Mar 18, 2025
Applicant Interview (Telephonic)
Mar 26, 2025
Request for Continued Examination
Mar 28, 2025
Response after Non-Final Action
Apr 12, 2025
Non-Final Rejection — §103
May 30, 2025
Examiner Interview (Telephonic)
May 30, 2025
Examiner Interview Summary
Jul 08, 2025
Response Filed
Oct 01, 2025
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12589770
SAFETY CONTROLLER FOR AUTOMATED DRIVING
2y 5m to grant • Granted Mar 31, 2026
Patent 12570148
ACCESSORY MANAGEMENT SYSTEM THAT IDENTIFIES ACCESSORIES TO ALLOW FOR CONNECTION
2y 5m to grant • Granted Mar 10, 2026
Patent 12552253
VEHICLE AND A METHOD OF CONTROLLING A DISPLAY TO OUTPUT A VISUAL INDICATION FOR INDUCING SELECTION OF A SPECIFIC DRIVING MODE
2y 5m to grant • Granted Feb 17, 2026
Patent 12534132
SYSTEM AND METHOD FOR PROVIDING A VISUAL AID FOR STEERING ANGLE OFFSET IN A STEER-BY-WIRE SYSTEM
2y 5m to grant • Granted Jan 27, 2026
Patent 12522245
COMPUTER-IMPLEMENTED METHOD FOR MANAGING AN OPERATIONAL DESIGN DOMAIN'S EXPANSION FOR AN AUTOMATED DRIVING SYSTEM
2y 5m to grant • Granted Jan 13, 2026
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 72%
With Interview: 94% (+22.6%)
Median Time to Grant: 3y 0m
PTA Risk: High
Based on 137 resolved cases by this examiner. Grant probability derived from career allow rate.
