Prosecution Insights
Last updated: April 19, 2026
Application No. 18/641,239

SYSTEMS AND METHODS FOR VEHICLE BEHAVIOR MONITORING AND QUANTIFICATION

Status: Final Rejection (§103)
Filed: Apr 19, 2024
Examiner: ARTIMEZ, DANA FERREN
Art Unit: 3667
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: Harman International Industries, Incorporated
OA Round: 2 (Final)

Grant Probability: 58% (Moderate)
OA Rounds: 3-4
To Grant: 3y 2m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 58% (46 granted / 80 resolved; +5.5% vs TC avg)
Interview Lift: +43.9% (strong; allowance among resolved cases with an interview vs. without)
Avg Prosecution: 3y 2m (typical timeline)
Currently Pending: 42
Total Applications: 122 (across all art units)

Statute-Specific Performance

§101: 19.0% (-21.0% vs TC avg)
§103: 46.2% (+6.2% vs TC avg)
§102: 7.3% (-32.7% vs TC avg)
§112: 24.6% (-15.4% vs TC avg)

Tech Center averages are estimates. Based on career data from 80 resolved cases.
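The headline percentages above are simple ratios and differences over the raw counts shown; a quick sketch of the arithmetic (variable names are illustrative, not taken from any analytics tool):

```python
# Career allow rate from the raw counts above.
granted = 46
resolved = 80
allow_rate = granted / resolved * 100        # 57.5%, displayed rounded as 58%
print(f"Career allow rate: {allow_rate:.1f}%")

# The "vs TC avg" deltas are plain differences; e.g. the examiner's 46.2%
# figure for §103 with a +6.2% delta implies a Tech Center average of:
sec_103_rate = 46.2
sec_103_delta = 6.2
tc_avg_103 = sec_103_rate - sec_103_delta    # implied TC average: 40.0%
print(f"Implied TC avg for §103: {tc_avg_103:.1f}%")
```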

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA. The examiner notes that the rejections rest on the broadest reasonable interpretation of the claim language. Applicant is invited to consider each reference as a whole. References are to be interpreted as they would be by one of ordinary skill in the art rather than by a novice. See MPEP 2141. The relevant inquiry when interpreting a reference is therefore not what the reference expressly discloses on its face, but what the reference would teach or suggest to one of ordinary skill in the art.

Status of the Claims

This is a Final Office Action in response to Applicant's amendment of 24 December 2025. Claims 1-20 are pending and have been considered as follows.

Information Disclosure Statement

The information disclosure statement (IDS) filed on 10/07/2025 is being considered by the examiner.

Response to Amendment and/or Argument

Applicant's amendments and/or arguments with respect to the rejections of Claims 1-20 under 35 U.S.C. 101 as set forth in the Office action of 24 September 2025 have been considered and are persuasive; those rejections are withdrawn. Applicant's amendments and/or arguments with respect to the rejections of Claims 1-20 under 35 U.S.C. 112(b) as set forth in the same Office action have likewise been considered, found persuasive, and withdrawn. Applicant's arguments with respect to claims 1 and 12 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Applicant's amendments and/or arguments with respect to the rejection of Claim 16 under 35 U.S.C. 103 as set forth in the Office action of 09/24/2025 have been considered and are NOT persuasive. Applicant asserts (pages 24-27 of Applicant's Remarks dated 12/24/2025) that the cited prior art, Chaves and Zhang, fails to teach calculating separate driving scores for the vehicle itself, vehicle-other vehicle interaction, vehicle-road user interaction, and vehicle-road infrastructure interaction, and then calculating a vehicle behavior score as a weighted sum of those factors; and that this specific scoring structure is not disclosed or suggested by the cited references.

The Examiner's Response: The examiner has carefully considered Applicant's arguments and respectfully disagrees. Chaves itself describes a detection environment in which behavior data regarding vehicle 120 is obtained from multiple different sources representing distinct interaction contexts. For example, the specification explains that the detection system 110 may receive behavior data captured by the vehicle 120, other vehicles 160, pedestrians 130, roadside sensors 140, roadside units 150, and mobile computing devices of other users 190, each capable of observing and capturing information indicative of the driving behavior of the vehicle 120. These disclosures indicate that the system obtains behavior information from sources corresponding to (i) the vehicle itself, (ii) other vehicles, (iii) road users such as pedestrians, and (iv) roadside infrastructure. Because the specification teaches collecting behavior data about the vehicle from these distinct interaction sources, one of ordinary skill in the art would reasonably understand that the driving behavior of the vehicle may be evaluated separately with respect to each interaction context (e.g., vehicle self-behavior, interactions with other vehicles, interactions with road users, and interactions with infrastructure).
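The interaction-context scoring discussed above can be sketched as routing each observation to one of the four contexts and combining per-context averages as a weighted sum; the source labels, context names, and weights below are illustrative assumptions for the sketch, not values drawn from Chaves, Wang, or the claims:

```python
from collections import defaultdict

# Illustrative routing of data sources to the four interaction contexts.
CONTEXT_BY_SOURCE = {
    "ego_vehicle": "self",
    "other_vehicle": "vehicle_vehicle",
    "pedestrian": "vehicle_road_user",
    "roadside_sensor": "vehicle_infrastructure",
    "roadside_unit": "vehicle_infrastructure",
}

# Assumed weighting values; the references describe weighting generally
# without fixing particular numbers.
WEIGHTS = {
    "self": 0.4,
    "vehicle_vehicle": 0.3,
    "vehicle_road_user": 0.2,
    "vehicle_infrastructure": 0.1,
}

def vehicle_behavior_score(observations):
    """observations: list of (source, score) pairs, score in [0, 100].
    Averages scores within each interaction context, then returns the
    weighted sum of the per-context scores."""
    per_context = defaultdict(list)
    for source, score in observations:
        per_context[CONTEXT_BY_SOURCE[source]].append(score)
    context_scores = {ctx: sum(v) / len(v) for ctx, v in per_context.items()}
    return sum(WEIGHTS[ctx] * s for ctx, s in context_scores.items())

score = vehicle_behavior_score([
    ("ego_vehicle", 90), ("other_vehicle", 80),
    ("pedestrian", 70), ("roadside_unit", 60),
])
print(score)  # 0.4*90 + 0.3*80 + 0.2*70 + 0.1*60 = 80.0
```

Grouping first and weighting second keeps the per-context scores inspectable, which matches the claimed step of identifying which individual scores fall below their thresholds.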
For example, Chaves' disclosure of "[0084] The other vehicles 160 may include autonomous vehicle…a position or vantage point…to observe the driving behavior of the vehicle 120…sensors…from which the driving behavior of the vehicle 120 can be observed or determined. [0085]…The captured images…sent to the detection system 110 to determine whether the vehicle exhibited one or more dangerous driving attributes…" suggests evaluating behavior based on vehicle-other vehicle interactions, since another vehicle observes the driving conduct of the vehicle 120. "[0075] The pedestrian 130 may…observe the driving behavior of the vehicle 120…the pedestrian may use mobile computing device to capture video…the captured vehicle may be sent to the detection system…determine whether the vehicle…run the stop sign…" describes behavior data generated from interactions between the vehicle and road users (e.g., a vehicle running a stop sign near a pedestrian). "[0076] The road-side sensors 140 may…provide information from which the location, velocity, direction of travel or orientation of the vehicle can be derived…[0082]…RSU 150 can be used to determine the location of vehicle…" describes roadside sensors capturing behavior of the vehicle relative to infrastructure, suggesting evaluation of vehicle-infrastructure interaction behavior.

The specification ([0126]) further teaches determining driving scores for a plurality of attributes or categories of driving behaviors, such as (but not limited to) traffic law violations, traffic sign violations, excessive speeds, vehicle lane management, driver attentiveness, frequent braking, tailgating, swerving, or insufficient distances maintained from other vehicles, thereby indicating that the system evaluates behavior through multiple distinct behavior metrics derived from observed driving events.
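The RSU-based localization invoked in the cited [0082] passage (fixed units at known positions, with ranges measured to the vehicle) reduces to standard trilateration; a minimal two-dimensional sketch, with made-up anchor coordinates and ranges:

```python
import math

def trilaterate(p1, r1, p2, r2, p3, r3):
    """Locate a point from its distances to three anchors with known 2-D
    positions (e.g., RSUs at fixed locations), by subtracting the circle
    equations pairwise and solving the resulting 2x2 linear system."""
    x1, y1 = p1
    x2, y2 = p2
    x3, y3 = p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero only when the anchors are collinear
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

# Ranges would come from RTT measurements (distance ~ c * RTT / 2);
# here they are simply given for a vehicle actually located at (3, 4).
x, y = trilaterate((0, 0), 5.0, (10, 0), math.sqrt(65), (0, 10), math.sqrt(45))
print(round(x, 3), round(y, 3))  # 3.0 4.0
```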
The specification describes assigning weighting values to individual driving scores and determining an overall driving score based on those weighted scores. In view of the teachings that the system calculates multiple behavior-specific driving scores and applies weighting values to produce an overall score, it would have been obvious to a person having ordinary skill in the art to organize such scores according to the interaction context from which the behavior data is obtained (e.g., the vehicle's self-operation, interactions with other vehicles, interactions with road users, and interactions with infrastructure) and to combine those scores using weighted aggregation to determine an overall vehicle behavior score. Accordingly, Applicant's argument regarding determining four interaction-specific scores and calculating a vehicle behavior score as a weighted sum of those scores is NOT persuasive.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains.
Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 1-5, 8 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Chaves et al. (US 2022/0355802 A1, hereinafter Chaves) in view of Wang et al. (US 2024/0317242 A1, hereinafter Wang).

Regarding Claim 1, Chaves teaches A method for an edge computing device (see at least Abstract, Fig. 1 & 4), comprising:

receiving, at an edge computing device, vehicle data from a vehicle via a communication link between the vehicle and the edge computing device; (see at least Fig. 1, 4 [0068-0144]: The driving behavior detection system 400 includes one or more transceivers 430 used to transmit and receive information to and from the one or more other devices, systems, or entities. For example, the transceivers 430 may facilitate the exchange of communications (such as signals and messages) between the vehicle 120, the pedestrians 130, the road-side sensors 140, the RSUs 150, the one or more other vehicles 160, the one or more entities 170, and the mobile computing devices 190. The detection system may identify each occurrence of the vehicle 120 ignoring a traffic sign or violating a traffic law during the time period, or exhibiting one or more dangerous driving attributes (e.g.
braking more than a number of instances, swerving within a lane, changing lanes more than a second number of instances, crossing multiple lane boundaries, flashing headlights at another vehicle, tailgating another vehicle, driving the vehicle less than a distance from another vehicle, or inattentiveness of a human driver of the vehicle.))

receiving, at the edge computing device, remote vehicle data from a remote vehicle via a communication link between the remote vehicle and the edge computing device; (see at least Fig. 1, 4 [0068-0144]: The driving behavior detection system 400 includes one or more transceivers 430 used to transmit and receive information to and from the one or more other devices, systems, or entities. For example, the transceivers 430 may facilitate the exchange of communications (such as signals and messages) between the vehicle 120, the pedestrians 130, the road-side sensors 140, the RSUs 150, the one or more other vehicles 160, the one or more entities 170, and the mobile computing devices 190. The other vehicles 160 may include autonomous vehicles, semi-autonomous vehicles, and conventional vehicles that are in a position or vantage point from which to observe the driving behavior of the vehicle 120. For instances in which the other vehicle is an autonomous vehicle, one or more sensors of the autonomous vehicle can be used to capture data, such as images, video, audio, or generate 3D point clouds from which the driving behavior of the vehicle 120 can be observed or determined. For instances in which the other vehicle is equipped with computer vision, the computer vision may be used to observe and record the driving behavior of the vehicle 120, at least while in range of the other vehicle.)

receiving, at the edge computing device, road user data from a road user via a communication link between the road user and the edge computing device; (see at least Fig. 1, 4 [0068-0144]: The driving behavior detection system 400 includes one or more transceivers 430 used to transmit and receive information to and from the one or more other devices, systems, or entities. For example, the transceivers 430 may facilitate the exchange of communications (such as signals and messages) between the vehicle 120, the pedestrians 130, the road-side sensors 140, the RSUs 150, the one or more other vehicles 160, the one or more entities 170, and the mobile computing devices 190. The pedestrian 130 may be any person or persons in a position or vantage point from which to observe the driving behavior of the vehicle 120. For example, while standing on a sidewalk near an intersection, the pedestrian 130 may be in a position to witness the vehicle 120 running a stop sign at the intersection. In some instances, the pedestrian 130 may use the mobile computing device 190 to capture video of the vehicle 120 running the stop sign. The captured video may be sent to the detection system 110. The captured video may be analyzed by the detection system 110 to determine whether the vehicle 120 did, in fact, run the stop sign.)

receiving, at the edge computing device, road infrastructure data from a road infrastructure device via a communication link between the road infrastructure device and the edge computing device; (see at least Fig. 1, 4 [0068-0144]: The road-side units (RSUs) 150 may include any suitable wireless communication device that can relay wireless signals between one another, the detection system 110, and/or the vehicle 120. The RSUs 150 may have fixed locations known to the detection system 110, and can be used to determine the position, velocity, and direction of the vehicle 120 at different instances in time or at different locations. The RSUs 150 may be configured to perform ranging operations with the vehicle 120. For example, the distance between a respective RSU 150 and the vehicle 120 may be determined based on the round-trip time (RTT) of a signal exchanged between the respective RSU 150 and the vehicle 120. The distances between the vehicle 120 and each of three or more RSUs 150 having known locations can be used to determine the precise location of the vehicle 120 using well-known trilateration techniques.)

executing instructions stored in a non-transitory memory of the edge computing device via a processor of the edge computing device to calculate a plurality of driving scores using the vehicle data, the remote vehicle data, the road user data, and the road infrastructure data, after excluding the identified anomalies, and calculate the vehicle behavior score of the vehicle using the plurality of driving scores; (see at least Fig. 1, 4 [0068-0144]: The driving score engine 466 may determine more than one driving score for a respective vehicle 120. That is, in some implementations, the driving score engine 466 may determine a driving score for each of a plurality of different driving attributes or categories. For example, the driving score engine 466 may determine driving scores for driving attributes such as (but not limited to) traffic law violations, traffic sign violations, excessive speeds, vehicle lane management, driver attentiveness, frequent braking, tailgating, swerving, or insufficient distances maintained from other vehicles. In some instances, the driving score engine 466 may assign a weighting value to each of the individual driving scores, and determine an overall driving score for the vehicle based on the weighted individual driving scores. The driving score engine 466 may determine the relative impact of each individual driving score on the overall driving score by selecting and/or adjusting the weighting values assigned to the individual driving scores. In this way, the driving score engine 466 may place greater emphasis on some driving scores (e.g., excessive speeding) than on other driving scores (e.g., frequent braking).)

comparing the vehicle behavior score to a minimum vehicle behavior score threshold and, in response to the vehicle behavior score being less than the minimum vehicle behavior score threshold, identifying one or more driving scores from the plurality of driving scores that are less than a corresponding minimum threshold driving score; (see at least Fig. 1-15 [0068-0167]: the detection system 110 may observe a driving behavior of the vehicle 120 during a time period. The detection system 110 may determine one or more driving scores for the vehicle 120 based on the observed driving behavior. In some aspects, the driving scores may be compared with one or more threshold values to determine whether the driving behavior of the vehicle 120 is unsafe or unsatisfactory. For example, if one or more of the driving scores are greater than one or more corresponding threshold values, the detection system 110 may determine that the vehicle 120 is driving in an unsatisfactory manner. In response to the determination of unsatisfactory driving, the detection system 110 may generate an indication of unsatisfactory driving behavior of the vehicle 120. The driving behavior engine 464 may determine whether the vehicle 120 ignored a particular traffic law by correlating the observed driving behavior of the vehicle 120 with the expected driving behavior corresponding to the particular traffic law. In some aspects, the driving behavior engine 464 may determine the expected driving behavior associated with the particular traffic law, and correlate the observed driving behavior of the vehicle 120 with the expected driving behavior to determine the level of compliance with, or the level of deviation from, the particular traffic law. The determined level of compliance or deviation may be compared with a corresponding threshold value to determine whether the vehicle 120 violated the particular traffic law.)

generating and outputting one or more control signals that correspond to the one or more driving scores identified as being less than the corresponding minimum threshold driving score, wherein the one or more control signals are configured to adjust operation of the vehicle; (see at least Fig. 1-15 [0068-0167]: The detection system 400 may limit one or more operations of the vehicle in response to determining that the at least one driving score exceeds the threshold value. The one or more operations may be selected to incentivize the vehicle (or its human driver) to improve its driving behavior, and thus improve its driving score, by exhibiting a safer driving behavior, obeying traffic signs, and complying with traffic laws. In some implementations, the one or more operations may include (but are not limited to) limiting a top speed of the vehicle, limiting the vehicle to a speed within a certain amount or percentage over a posted speed limit, prohibiting the vehicle from using HOV lanes or toll lanes, precluding membership in a platoon, revoking membership in a platoon, disabling or restricting one or more features of a manual driving mode of the vehicle, disabling or restricting one or more features of an autonomous driving mode of the vehicle, or any combination thereof.)

and outputting and storing the vehicle behavior score. (see at least Fig. 1-15 [0068-0167]: The detection system 400 may provide the indication of unsatisfactory driving to one or more of the third-party entities 170.
As discussed, the one or more third-party entities may include (but are not limited to) a human driver of the vehicle, a human passenger of the vehicle, an owner of the vehicle, an insurer of the vehicle, a heads-up display of the vehicle, a law enforcement agency, one or more police vehicles, a government motor vehicle agency, or one or more other vehicles. The one or more processors 410 may execute the reporting program 416 to generate a report indicating the number of identified occurrences of each dangerous driving attribute exhibited by the vehicle 120 during a certain time period and/or within a certain geographic area. In some instances, the detection system 400 may provide the report to one or more of the third-party entities 170.)

It may be alleged that Chaves does not explicitly teach identifying anomalies in at least one of the vehicle data, the remote vehicle data, the road user data, and the road infrastructure data using a trained machine learning algorithm, wherein the trained machine learning algorithm is trained using a labeled dataset, and wherein the anomalies comprise behaviors occurring below a threshold frequency; and excluding the identified anomalies from calculation of a vehicle behavior score. Wang is directed to a system and method for filtering vehicle data before training and/or running a prediction model. Wang teaches identifying anomalies in at least one of the vehicle data, the remote vehicle data, the road user data, and the road infrastructure data using a trained machine learning algorithm, wherein the trained machine learning algorithm is trained using a labeled dataset, and wherein the anomalies comprise behaviors occurring below a threshold frequency; and excluding the identified anomalies from calculation of a vehicle behavior score. (see at least Fig. 1 [0046-0066]: The extracted features may then be filtered by the anomaly filter 126 and working condition filter 128 so as to generate a filtered valid dataset 130.
These filters 126, 128 operate based on the understanding that not all data can be used for modeling a vehicle driving process. The working condition filter 128, which will be described in greater detail hereafter, operates to filter out data that corresponds to the vehicle conditions that cannot be used for vehicle weight prediction, e.g., if all five conditions are not met, then the corresponding low frequency data or extracted features may be filtered out of the useable dataset when generating the filtered valid data 130, i.e., data corresponding to periods of operation of the vehicle where these 5 conditions are not satisfied is considered to be invalid data. With regard to the anomaly data filter 126, for example, in some illustrative embodiments, an Elliptic envelope based filter may be used to identify data within a specified ellipse, which is considered normal data, and data outside the specified ellipse, which is considered anomaly data. The anomaly data filter 126 may be applied before, after, or at substantially a same time as the working condition filter 128, so as to generate a filtered up-sampled vehicle operation dataset. The filtered valid dataset 130 may be pre-processed by the AI model pre-processor 140 prior to input to the AI computer models 150, 160 in order to minimize the error of machine learning processes, e.g., linear regression, by expanding the filtered valid dataset 130. That is, the system discloses filtering low-frequency or invalid vehicle data and excluding it from the dataset used for modeling and subsequent computation.)
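The claim's frequency-based notion of an anomaly (behaviors occurring below a threshold frequency, excluded before the behavior score is calculated) can be sketched with a simple count-based filter; the event names and the 10% threshold are illustrative, and this stands in for, rather than reproduces, Wang's elliptic-envelope approach:

```python
from collections import Counter

def exclude_rare_behaviors(events, min_frequency=0.1):
    """Exclude events whose behavior type occurs below min_frequency
    (as a fraction of all observations) before any scores are computed."""
    counts = Counter(e["behavior"] for e in events)
    total = len(events)
    return [e for e in events
            if counts[e["behavior"]] / total >= min_frequency]

# Illustrative event stream: one behavior type is too rare to be
# representative (1 of 20 events = 5%, below the 10% threshold).
events = ([{"behavior": "hard_braking"}] * 12
          + [{"behavior": "tailgating"}] * 7
          + [{"behavior": "sensor_glitch"}] * 1)

filtered = exclude_rare_behaviors(events)
print(len(events), "->", len(filtered))  # 20 -> 19; the rare event is dropped
```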
Accordingly, it would have been obvious to a person of ordinary skill in the art (POSITA) before the effective filing date of the claimed invention to have modified Chaves's system and method for determining whether a vehicle is driving in an unsafe or unsatisfactory manner to incorporate the teachings of Wang for filtering anomalous or low-frequency vehicle data from a dataset prior to computing a vehicle behavior score. A POSITA would have been motivated to make such a modification, with a reasonable expectation of success, because Wang teaches that filtering invalid or low-frequency operating data produces a filtered valid dataset suitable for machine-learning analysis, and incorporating such filtering techniques into the system and method of Chaves would improve the accuracy and reliability of the driving score by preventing anomalous or invalid vehicle data from skewing model results.

Regarding Claim 2, the combination of Chaves in view of Wang teaches The method of claim 1. Chaves further teaches wherein calculating the vehicle behavior score further comprises identifying vehicle events and identifying events of vehicle interaction with each of the remote vehicle, the road user, and the road infrastructure device using the vehicle data and the remote vehicle data, the road user data, and the road infrastructure data, respectively. (see at least Fig. 1-15 [0068-0167]: the detection system 110 may determine whether the vehicle 120 ignored one or more traffic signs or violated one or more traffic laws during the time period. In some instances, the detection system 110 may identify each occurrence of the vehicle 120 ignoring a traffic sign or violating a traffic law. The detection system 110 may generate at least one of the driving scores based on the number of identified occurrences of ignoring traffic signs or violating traffic laws during the time period.
In other implementations, the detection system 110 may determine whether the vehicle 120 exhibits one or more dangerous driving attributes during the time period. In some instances, the dangerous driving attributes may include (but are not limited to) braking more than a number of instances, swerving within a lane, changing lanes more than a second number of instances, crossing multiple lane boundaries, flashing headlights at another vehicle, tailgating another vehicle, driving the vehicle less than a distance from another vehicle, or an inattentiveness of a human driver of the vehicle. As such, one or more of the driving scores may be based at least in part on the detection of one or more dangerous driving attributes exhibited by the vehicle 120 during the time period.)

Regarding Claim 3, the combination of Chaves in view of Wang teaches The method of claim 2. Chaves further teaches wherein the calculating the plurality of driving scores comprises applying a weight to the vehicle events and each event of vehicle interaction, and calculating a sum of the weighted vehicle events and the weighted events of vehicle interaction, the plurality of driving scores comprises a driving score for the vehicle and a driving score for interactions of the vehicle with each of the remote vehicle, the road user, and the road infrastructure device. (see at least Fig. 1-15 [0068-0167]: the driving score engine 466 may determine a driving score for each of a plurality of different driving attributes or categories. For example, the driving score engine 466 may determine driving scores for driving attributes such as (but not limited to) traffic law violations, traffic sign violations, excessive speeds, vehicle lane management, driver attentiveness, frequent braking, tailgating, swerving, or insufficient distances maintained from other vehicles. In some instances, the driving score engine 466 may assign a weighting value to each of the individual driving scores, and determine an overall driving score for the vehicle based on the weighted individual driving scores. The driving score engine 466 may determine the relative impact of each individual driving score on the overall driving score by selecting and/or adjusting the weighting values assigned to the individual driving scores. In this way, the driving score engine 466 may place greater emphasis on some driving scores (e.g., excessive speeding) than on other driving scores (e.g., frequent braking).)

Regarding Claim 4, the combination of Chaves in view of Wang teaches The method of claim 3, wherein calculating the vehicle behavior score comprises: Chaves further teaches calculating a weighted sum of the driving score for vehicle-remote vehicle interaction, the driving score for vehicle-road user interaction, the driving score for vehicle-road infrastructure device interaction, and the driving score for the vehicle. (see at least Fig. 1-15 [0068-0167]: the driving score engine 466 may determine a driving score for each of a plurality of different driving attributes or categories. For example, the driving score engine 466 may determine driving scores for driving attributes such as (but not limited to) traffic law violations, traffic sign violations, excessive speeds, vehicle lane management, driver attentiveness, frequent braking, tailgating, swerving, or insufficient distances maintained from other vehicles. In some instances, the driving score engine 466 may assign a weighting value to each of the individual driving scores, and determine an overall driving score for the vehicle based on the weighted individual driving scores. The driving score engine 466 may determine the relative impact of each individual driving score on the overall driving score by selecting and/or adjusting the weighting values assigned to the individual driving scores. In this way, the driving score engine 466 may place greater emphasis on some driving scores (e.g., excessive speeding) than on other driving scores (e.g., frequent braking).)

Regarding Claim 5, the combination of Chaves in view of Wang teaches The method of claim 4. Chaves further teaches wherein calculating the vehicle behavior score further comprises calculating the weighted sum of traffic violations and alerts of the vehicle. (see at least Fig. 1-15 [0068-0167]: the driving score engine 466 may determine a driving score for each of a plurality of different driving attributes or categories. For example, the driving score engine 466 may determine driving scores for driving attributes such as (but not limited to) traffic law violations, traffic sign violations, excessive speeds, vehicle lane management, driver attentiveness, frequent braking, tailgating, swerving, or insufficient distances maintained from other vehicles. In some instances, the driving score engine 466 may assign a weighting value to each of the individual driving scores, and determine an overall driving score for the vehicle based on the weighted individual driving scores. The driving score engine 466 may determine the relative impact of each individual driving score on the overall driving score by selecting and/or adjusting the weighting values assigned to the individual driving scores. In this way, the driving score engine 466 may place greater emphasis on some driving scores (e.g., excessive speeding) than on other driving scores (e.g., frequent braking).)

Regarding Claim 8, the combination of Chaves in view of Wang teaches The method of claim 1. Chaves further teaches wherein the control signals are configured to adjust operation of the vehicle by implementing a maximum speed of the vehicle. (see at least Fig. 1-15 [0068-0167]: the detection system 400 may provide the indication of unsatisfactory driving to one or more of the third-party entities 170, wherein the one or more third-party entities may include (but are not limited to) a human driver of the vehicle, a human passenger of the vehicle, an owner of the vehicle, an insurer of the vehicle, a heads-up display of the vehicle, a law enforcement agency, one or more police vehicles, a government motor vehicle agency, or one or more other vehicles. The detection system 400 may limit one or more operations of the vehicle in response to determining that the at least one driving score exceeds the threshold value. The one or more operations may be selected to incentivize the vehicle (or its human driver) to improve its driving behavior, and thus improve its driving score, by exhibiting a safer driving behavior, obeying traffic signs, and complying with traffic laws. In some implementations, the one or more operations may include (but are not limited to) limiting a top speed of the vehicle, limiting the vehicle to a speed within a certain amount or percentage over a posted speed limit, prohibiting the vehicle from using HOV lanes or toll lanes, precluding membership in a platoon, revoking membership in a platoon, disabling or restricting one or more features of a manual driving mode of the vehicle, disabling or restricting one or more features of an autonomous driving mode of the vehicle, or any combination thereof.)

Regarding Claim 10, the combination of Chaves in view of Wang teaches The method of claim 1. Chaves further teaches wherein the vehicle behavior score is output and stored in association with an identity of an operator of the vehicle as an operator behavior score. (see at least Fig.
1-15 [0068-0167]: In one implementation, an entity associated with a human driver of a vehicle having an unsatisfactory driving score may participate in selecting the limits and restrictions placed on the vehicle (or its human driver). For example, if a teenager is determined to exhibit unsatisfactory driving behavior, the detection system 400 may notify the teenager's parents of the unsatisfactory driving behavior and solicit suggestions regarding which operations or features of the vehicle should be disabled, which operations or features of the vehicle should be restricted or limited, and which operations or features of the vehicle should be maintained in their current states.) Claim(s) 6 is rejected under 35 U.S.C. 103 as being unpatentable over Chaves in view of Wang and Vakeesar et al. (US 2023/0422142 A1 hereinafter Vakeesar). Regarding Claim 6, the combination of Chaves in view of Wang teaches The method of claim 1. It may be alleged that the combination of Chaves in view of Wang does not explicitly teach wherein a maximum latency from receiving the vehicle data to outputting the vehicle behavior score is 30 milliseconds. Vakeesar is directed to the field of communication networks, relating to routing data traffic for mobile edge computing devices in wireless communication networks. Vakeesar teaches wherein a maximum latency from receiving the vehicle data to outputting the vehicle behavior score is 30 milliseconds. (see at least [0051-0087]: Cooperative maneuvers, for example, in the case of emergency trajectory alignment between UEs supporting a V2X application, may require that the communication system supports message exchange with a maximum end-to-end latency of 3 ms between a host vehicle and a remote vehicle, when the vehicles are driven at an absolute speed of up to 130 km/h, while supporting a data rate of 30 Mbps.
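As an illustration of the latency limitation at issue in Claim 6, a scoring routine can be timed from data receipt to score output and checked against a 30 ms budget. This is a minimal, hypothetical sketch; the function and variable names are not from the record, and `score_fn` stands in for any behavior-scoring computation.

```python
import time

LATENCY_BUDGET_S = 0.030  # claimed maximum: 30 ms from data receipt to score output


def score_within_budget(vehicle_data, score_fn):
    """Time an arbitrary scoring routine against the 30 ms latency budget.

    Returns the score, the measured wall-clock latency in seconds, and
    whether the budget held. `score_fn` is a placeholder, not a claimed
    or disclosed algorithm.
    """
    start = time.perf_counter()
    score = score_fn(vehicle_data)
    elapsed = time.perf_counter() - start
    return score, elapsed, elapsed <= LATENCY_BUDGET_S
```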
This may enable an application server to process data from one or more remote vehicles sufficiently quickly before passing the processed data to an appropriate host vehicle. The application server may decide which data from which remote vehicles are relevant to which host vehicle. These requirements demonstrate the benefits of MEC in ensuring high reliability and high availability of MEC-V2X application services. In the case of, for instance, cooperative collision avoidance, in order to meet high-speed and low-latency requirements, it may be therefore beneficial to have an application server located locally (i.e., not remotely).) Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Chaves and Wang to incorporate the technique of low-latency decision-making for safety-critical cooperative maneuvers as taught by Vakeesar, with a reasonable expectation of success, to ensure the high reliability, high speed, and low latency that keep safety-critical computations from being delayed, and in doing so enhance vehicle operation safety and reliability. Claim(s) 7 is rejected under 35 U.S.C. 103 as being unpatentable over Chaves in view of Wang and Clement et al. (US 2020/0065711 A1 hereinafter Clement). Regarding Claim 7, the combination of Chaves in view of Wang teaches The method of claim 1. It may be alleged that the combination of Chaves in view of Wang does not explicitly teach wherein the labeled dataset comprises multiple simulated scenarios including vehicle events and events of vehicle interaction, each labeled as anomalous or not anomalous. Clement is directed to systems and methods for detecting and recording anomalous vehicle events. Clement teaches wherein the labeled dataset comprises multiple simulated scenarios including vehicle events and events of vehicle interaction, each labeled as anomalous or not anomalous. (see at least Fig.
3 [0032-0039]: The training databases 342 and 344 may include contextual data covering a large number of normal events and a large number of anomalous events, respectively. The normal events may include operations that are consistent with predictions based on historical data. The operations related to normal events may be predictable by the prediction model of the vehicle (e.g., within a threshold to the predicted operations). The training databases 342 and 344 may include an initial data set of normal and anomalous events which are labeled by human and/or another data set of normal and anomalous events automatically classified by machine-learning models. The training data may be constructed and optimized by weighting normal operation data and edge-case data differently, since edge-case data are typically sparse relative to normal operation data.) Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Chaves and Wang to incorporate the technique of labeling training data sets of normal and anomalous events as taught by Clement, with a reasonable expectation of success, so that the vehicle models can learn to distinguish safe driving behavior from anomalies, leading to more accurate prediction results. Claim(s) 9 is rejected under 35 U.S.C. 103 as being unpatentable over Chaves in view of Wang and Bryer et al. (US 10,445,758 B1 hereinafter Bryer). Regarding Claim 9, the combination of Chaves in view of Wang teaches The method of claim 1. It may be alleged that the combination of Chaves in view of Wang does not explicitly teach wherein the vehicle behavior score is calculated in response to determination that a trip of the vehicle has ended.
Bryer is directed to a system and method for providing rewards based on driving behaviors detected by a mobile computing device. Bryer teaches wherein the vehicle behavior score is calculated in response to determination that a trip of the vehicle has ended. (see at least Fig. 9 Col. 33 Line 22 – Col. 36 Line 65: For example, the overall trip score may be an average or weighted average of the individual scores for the driving performance metrics. The scoring engine 932 may be configured to utilize respective weights for the individual scores of the driving performance metrics, and such weights may be configurable at the scoring engine. The scoring engine 932 may be configured to determine the trip scores in response to a determination that a trip is complete. The trip information may be provided as input to the scoring engine 932, and the scoring engine may provide a set of scores as output in response.) Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Chaves and Wang to incorporate the technique of determining the trip score in response to a determination that a trip is completed as taught by Bryer, with a reasonable expectation of success, to avoid incomplete or inaccurate scoring: trip-end scoring allows for batch processing on edge devices and allows the system to consider cumulative metrics, such as the total number of harsh braking events or the time spent speeding as a percentage of total trip duration. Claim(s) 12-15 are rejected under 35 U.S.C. 103 as being unpatentable over Chaves in view of Okumura et al. (US 2016/0139594 A1 hereinafter Okumura). Regarding Claim 12, Chaves teaches A method for a vehicle (see at least Abstract Fig.
1 & 4), comprising: receiving a vehicle behavior score and a vehicle control signal from an edge computing device, the vehicle behavior score computed based on vehicle data and at least one of road infrastructure data, remote vehicle data, and road user data; (see at least Fig. 1, 4 [0068-0144]: The driving behavior detection system 400 includes one or more transceivers 430 used to transmit and receive information to and from the one or more other devices, systems, or entities. For example, the transceivers 430 may facilitate the exchange of communications (such as signals and messages) between the vehicle 120 , the pedestrians 130 , the road-side sensors 140 , the RSUs 150 , the one or more other vehicles 160 , the one or more entities 170 , and the mobile computing devices 190. The detection system may identify each occurrence of the vehicle 120 ignoring a traffic sign or violating a traffic law during the time period, or exhibiting one or more dangerous driving attributes (e.g. braking more than a number of instances, swerving within a lane, changing lanes more than a second number of instances, crossing multiple lane boundaries, flashing headlights at another vehicle, tailgating another vehicle, driving the vehicle less than a distance from another vehicle, or inattentiveness of a human driver of the vehicle.) The detection system 400 may provide the indication of unsatisfactory driving to one or more of the third-party entities 170 . As discussed, the one or more third-party entities may include (but are not limited to) a human driver of the vehicle, a human passenger of the vehicle, an owner of the vehicle, an insurer of the vehicle, a heads-up display of the vehicle, a law enforcement agency, one or more police vehicles, a government motor vehicle agency, or one or more other vehicles. The detection system 400 may limit one or more operations of the vehicle in response to determining that the at least one driving score exceeds the threshold value.) 
outputting a notification indicating that the vehicle control signal has been received; (see at least [0013, 0022, 0130-0150]: The detection system 400 may provide the indication of unsatisfactory driving to one or more of the third-party entities 170 (e.g. a human driver of the vehicle, a human passenger of the vehicle, an owner of the vehicle, an insurer of the vehicle, a heads-up display of the vehicle, a law enforcement agency, one or more police vehicles, a government motor vehicle agency, or one or more other vehicles) and may limit one or more operations of the vehicle in response to determining that the at least one driving score exceeds the threshold value. For example, if a teenager is determined to exhibit unsatisfactory driving behavior, the detection system 400 may notify the teenager's parents of the unsatisfactory driving behavior and solicit suggestions regarding which operations or features of the vehicle should be disabled, which operations or features of the vehicle should be restricted or limited, and which operations or features of the vehicle should be maintained in their current states.) receiving a user input accepting the vehicle control signal; (see at least Fig. 1-4 [0074, 0130-0145]: The detection system 400 may limit one or more operations of the vehicle (the vehicle may be an autonomous vehicle) in response to determining that at least one driving score exceeds the threshold value. The one or more operations may include (but are not limited to) limiting a top speed of the vehicle, limiting the vehicle to a speed within a certain amount or percentage over a posted speed limit, prohibiting the vehicle from using HOV lanes or toll lanes, precluding membership in a platoon, revoking membership in a platoon, disabling or restricting one or more features of a manual driving mode of the vehicle, disabling or restricting one or more features of an autonomous driving mode of the vehicle, or any combination thereof.)
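The notify, accept, and implement sequence mapped above can be sketched as a small decision function. The names, return values, and the timeout default are illustrative assumptions for exposition, not language from the claims or the cited references.

```python
def resolve_control_signal(user_response, timeout_expired, default_apply=True):
    """Decide whether to implement a received vehicle control signal.

    A sketch of a prompt-and-confirm flow: explicit acceptance applies
    the signal, explicit rejection keeps manual control, and no response
    within a predefined window falls through to a configurable default.
    """
    if user_response is True:
        return "apply"      # driver accepted the control signal
    if user_response is False:
        return "manual"     # driver elected to retain control
    if timeout_expired:
        return "apply" if default_apply else "manual"
    return "pending"        # still waiting within the response window
```

The `default_apply` flag mirrors the two variants described for Okumura below: control may pass automatically after the window expires, or the vehicle may instead drop to manual mode.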
in response to receiving the user input, implementing the vehicle control signal, wherein the implementing the vehicle control signal comprises adjusting operation of a vehicle; and storing the vehicle behavior score. (see at least Fig. 1-4 [0074, 0130-0145]: The detection system 400 may limit one or more operations of the vehicle (the vehicle may be an autonomous vehicle) in response to determining that at least one driving score exceeds the threshold value. The one or more operations may include (but are not limited to) limiting a top speed of the vehicle, limiting the vehicle to a speed within a certain amount or percentage over a posted speed limit, prohibiting the vehicle from using HOV lanes or toll lanes, precluding membership in a platoon, revoking membership in a platoon, disabling or restricting one or more features of a manual driving mode of the vehicle, disabling or restricting one or more features of an autonomous driving mode of the vehicle, or any combination thereof.) Although Chaves teaches notifying the driver of unsatisfactory scores and limiting vehicle functions, it does not explicitly disclose obtaining (human) user acceptance before restricting vehicle functions, namely, receiving a user input accepting the vehicle control signal; in response to receiving the user input, implementing the vehicle control signal, wherein the implementing the vehicle control signal comprises adjusting operation of a vehicle. Okumura is directed to remote operation of an autonomous vehicle in an unexpected environment. Okumura teaches receiving a user input accepting the vehicle control signal; in response to receiving the user input, implementing the vehicle control signal, wherein the implementing the vehicle control signal comprises adjusting operation of a vehicle; (see at least Fig.
2 [0017-0039]: In one example implementation, before the remote operator is contacted, the driver can be prompted and/or asked, for example, using the interactive display, the audio system, or other vehicle interfaces 118, to confirm or approve granting control to the remote operator. If the driver responds affirmatively and elects to initiate remote operation mode, then the remote operator can be contacted, the relevant data captured by the sensors 130 can be sent to the remote operator, and control of the vehicle 200 can pass to the remote operator. If, on the other hand, the driver does not elect to contact the remote operator, then the vehicle 200 can switch to manual mode and the driver can take control of the vehicle 200 and vehicle systems 116. Alternatively, the vehicle 200 can be configured so that the driver is given the opportunity to affirmatively elect to retain control of the vehicle 200 and enter it into manual mode, and if the driver does not so elect within a predefined number of seconds, then the remote operator is contacted and control of the vehicle 200 passes automatically to the remote operator.) Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings such that, after notifying the user of the vehicle (e.g., autonomous, semi-autonomous, or manual) as in Chaves, the system would prompt the driver to confirm or approve granting control to a remote operator before remote intervention occurs as taught by Okumura, particularly in semi-autonomous or manually-assisted modes, with a reasonable expectation of success, to ensure safety and proper human oversight. Regarding Claim 13, the combination of Chaves in view of Okumura teaches The method of claim 12, Chaves further teaches wherein adjusting operation of the vehicle system comprises implementing a maximum speed of the vehicle when the vehicle is within a predetermined range of a road user.
(see at least Fig. 1-15 [0068-0167]: The one or more operations may be selected to incentivize the vehicle (or its human driver) to improve its driving behavior, and thus improve its driving score, by exhibiting a safer driving behavior, obeying traffic signs, and complying with traffic laws. In some implementations, the one or more operations may include (but are not limited to) limiting a top speed of the vehicle, limiting the vehicle to a speed within a certain amount or percentage over a posted speed limit, prohibiting the vehicle from using HOV lanes or toll lanes, precluding membership in a platoon, revoking membership in a platoon, disabling or restricting one or more features of a manual driving mode of the vehicle, disabling or restricting one or more features of an autonomous driving mode of the vehicle, or any combination thereof. The detection system may inform that the respective vehicle is limited to a top speed of 50 mph and is not permitted to use interstate highways.) Regarding Claim 14, the combination of Chaves in view of Okumura teaches The method of claim 13, Chaves further teaches wherein implementing the vehicle control signal further comprises outputting a notification. (see at least [0013, 0022, 0130-0150]: The detection system 400 may provide the indication of unsatisfactory driving to one or more of the third-party entities 170 (e.g. a human driver of the vehicle, a human passenger of the vehicle, an owner of the vehicle, an insurer of the vehicle, a heads-up display of the vehicle, a law enforcement agency, one or more police vehicles, a government motor vehicle agency, or one or more other vehicles) and may limit one or more operations of the vehicle in response to determining that the at least one driving score exceeds the threshold value.
For example, if a teenager is determined to exhibit unsatisfactory driving behavior, the detection system 400 may notify the teenager's parents of the unsatisfactory driving behavior and solicit suggestions regarding which operations or features of the vehicle should be disabled, which operations or features of the vehicle should be restricted or limited, and which operations or features of the vehicle should be maintained in their current states.) Regarding Claim 15, the combination of Chaves in view of Okumura teaches The method of claim 12, Chaves further teaches wherein adjusting operation of the vehicle system comprises automatically adjusting a distance of the vehicle from a road user. (see at least [0165]: The one or more operations may include limiting a speed of the vehicle or limiting the vehicle to a speed within a certain amount or percentage over a posted speed limit, disabling or limiting one or more features of an infotainment system of the vehicle, disabling or restricting one or more features of a manual driving mode of the vehicle, disabling or restricting one or more features of an autonomous driving mode of the vehicle, restricting travel of the vehicle to certain areas or along certain routes, requiring the vehicle to increase spacings between the vehicle and other vehicles, disabling the vehicle for a period of time after the vehicle arrives at a destination, or any combination thereof.) Claim(s) 16-20 are rejected under 35 U.S.C. 103 as being unpatentable over Chaves. Regarding Claim 16, Chaves teaches A system (see at least Abstract Fig. 1 & 4), comprising: a vehicle having a sensor subsystem configured to capture information about at least one of a vehicle behavior and an operator behavior, the vehicle further configured to adjust operation of a vehicle system in response to receiving a vehicle control signal; (see at least Fig.
1-2 [0068-0160]: The sensors 240 may include any suitable sensors or devices that can be used, individually or in conjunction with one another, to scan a surrounding environment for objects, other vehicles, roads, road conditions, traffic signs, traffic lights, weather conditions, environmental features, buildings, hazardous conditions, and other attributes, characteristics, or features of the surrounding environment. The vehicle controller 230 may interface with the autonomous vehicle's control system 210 , and may be used to control various operations of the autonomous vehicle 200 including (but not limited to) assuming control of the autonomous vehicle 200 , providing instructions to the autonomous vehicle 200 , configuring the autonomous vehicle 200 for passenger service, disabling the autonomous vehicle 200 , restricting one or more operations of the autonomous vehicle 200 , and limiting one or more driving metrics of the autonomous vehicle 200 . For example, in some instances, the vehicle controller 230 may be used to limit one or more of a maximum speed of the autonomous vehicle 200 , a driving distance of the autonomous vehicle 200 , and so on.) and an edge computing device communicably coupled to the vehicle, the edge computing device including a processor and a non-transitory memory storing executable instructions that (see at least Fig. 1-4), when executed, cause the processor to: receiving vehicle data from a vehicle; (see at least Fig. 1, 4 [0068-0144]: The driving behavior detection system 400 includes one or more transceivers 430 used to transmit and receive information to and from the one or more other devices, systems, or entities. For example, the transceivers 430 may facilitate the exchange of communications (such as signals and messages) between the vehicle 120 , the pedestrians 130 , the road-side sensors 140 , the RSUs 150 , the one or more other vehicles 160 , the one or more entities 170 , and the mobile computing devices 190. 
The detection system may identify each occurrence of the vehicle 120 ignoring a traffic sign or violating a traffic law during the time period, or exhibiting one or more dangerous driving attributes (e.g. braking more than a number of instances, swerving within a lane, changing lanes more than a second number of instances, crossing multiple lane boundaries, flashing headlights at another vehicle, tailgating another vehicle, driving the vehicle less than a distance from another vehicle, or inattentiveness of a human driver of the vehicle.)) receiving remote vehicle data from a remote vehicle; (see at least Fig. 1, 4 [0068-0144]: The driving behavior detection system 400 includes one or more transceivers 430 used to transmit and receive information to and from the one or more other devices, systems, or entities. For example, the transceivers 430 may facilitate the exchange of communications (such as signals and messages) between the vehicle 120 , the pedestrians 130 , the road-side sensors 140 , the RSUs 150 , the one or more other vehicles 160 , the one or more entities 170 , and the mobile computing devices 190. The other vehicles 160 may include autonomous vehicles, semi-autonomous vehicles, and conventional vehicles that are in a position or vantage point from which to observe the driving behavior of the vehicle 120. For instances in which the other vehicle is an autonomous vehicle, one or more sensors of the autonomous vehicle can be used to capture data, such as images, video, audio, or generate 3D point clouds from which the driving behavior of the vehicle 120 can be observed or determined. For instances in which the other vehicle is equipped with computer vision, the computer vision may be used to observe and record the driving behavior of the vehicle 120 , at least while in range of the other vehicle.) receiving road user data from a road user; (see at least Fig. 
1, 4 [0068-0144]: The driving behavior detection system 400 includes one or more transceivers 430 used to transmit and receive information to and from the one or more other devices, systems, or entities. For example, the transceivers 430 may facilitate the exchange of communications (such as signals and messages) between the vehicle 120 , the pedestrians 130 , the road-side sensors 140 , the RSUs 150 , the one or more other vehicles 160 , the one or more entities 170 , and the mobile computing devices 190. The pedestrian 130 may be any person or persons in a position or vantage point from which to observe the driving behavior of the vehicle 120 . For example, while standing on a sidewalk near an intersection, the pedestrian 130 may be in a position to witness the vehicle 120 running a stop sign at the intersection. In some instances, the pedestrian 130 may use the mobile computing device 190 to capture video of the vehicle 120 running the stop sign. The captured video may be sent to the detection system 110 . The captured video may be analyzed by the detection system 110 to determine whether the vehicle 120 did, in fact, run the stop sign.) receiving road infrastructure data from a road infrastructure device; (see at least Fig. 1, 4 [0068-0144]: The road-side units (RSUs) 150 may include any suitable wireless communication device that can relay wireless signals between one another, the detection system 110 , and/or the vehicle 120 . The RSUs 150 may have fixed locations known to the detection system 110 , and can be used to determine the position, velocity, and direction of the vehicle 120 at different instances in time or at different locations. The RSUs 150 may be configured to perform ranging operations with the vehicle 120 . For example, the distance between a respective RSU 150 and the vehicle 120 may be determined based on the round-trip time (RTT) of a signal exchanged between the respective RSU 150 and the vehicle 120 . 
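The RTT ranging described in the cited passage reduces to elementary geometry: the signal covers the range twice, so distance is c·RTT/2, and three such ranges from non-collinear RSUs fix a 2D position. A minimal sketch, assuming noise-free ranges and hypothetical coordinates:

```python
import math

C = 299_792_458.0  # speed of light, m/s


def rtt_to_distance(rtt_seconds):
    """Range from a round-trip time: the signal travels the distance twice."""
    return C * rtt_seconds / 2.0


def trilaterate(anchors, distances):
    """Locate a point in 2D from three known anchors and measured ranges.

    Subtracting the first circle equation from the other two linearizes
    the problem into a 2x2 system in (x, y).
    """
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = distances
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        raise ValueError("anchors are collinear; position is ambiguous")
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det
```

With noisy ranges, more than three anchors and a least-squares fit would replace the exact 2x2 solve.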
The distances between the vehicle 120 and each of three or more RSUs 150 having known locations can be used to determine the precise location of the vehicle 120 using well-known trilateration techniques.) identify vehicle events and events of vehicle interaction with each of the remote vehicle, the road user, and the road infrastructure device using the vehicle data, the remote vehicle data, the road user data, and the road infrastructure data; (see at least Fig. 1-4 [0068-0144]: The detection system 110 may determine one or more driving scores for the vehicle 120 based on the observed driving behavior. The detection system 110 may determine whether the vehicle 120 ignored one or more traffic signs or violated one or more traffic laws during the time period. In some instances, the detection system 110 may identify each occurrence of the vehicle 120 ignoring a traffic sign or violating a traffic law. In some instances, the dangerous driving attributes may include (but are not limited to) braking more than a number of instances, swerving within a lane, changing lanes more than a second number of instances, crossing multiple lane boundaries, flashing headlights at another vehicle, tailgating another vehicle, driving the vehicle less than a distance from another vehicle, or an inattentiveness of a human driver of the vehicle. 
The pedestrian 130, other vehicles 160, roadside sensors 140, roadside units 150 and other users may observe and capture behavior of the vehicles and send information to the detection system) calculate a driving score for the vehicle and a driving score for each of the vehicle-remote vehicle interaction, vehicle-road user interaction, and vehicle-road infrastructure interaction by applying a weight to each vehicle event and event of vehicle interaction; calculate a vehicle behavior score as a weighted sum of the driving score for the vehicle, the driving score for vehicle-remote vehicle interaction, the driving score for vehicle-road user interaction, and the driving score for vehicle-road infrastructure interaction;(see at least Fig. 1, 4 [0068-0144]: The driving score engine 466 may determine more than one driving score for a respective vehicle 120. That is, in some implementations, the driving score engine 466 may determine a driving score for each of a plurality of different driving attributes or categories. For example, the driving score engine 466 may determine driving scores for driving attributes such as (but not limited to) traffic law violations, traffic sign violations, excessive speeds, vehicle lane management, driver attentiveness, frequent braking, tailgating, swerving, or insufficient distances maintained from other vehicles. In some instances, the driving score engine 466 may assign a weighting value to each of the individual driving scores, and determine an overall driving score for the vehicle based on the weighted individual driving scores. The driving score engine 466 may determine the relative impact of each individual driving score on the overall driving score by selecting and/or adjusting the weighting values assigned to the individual driving scores. In this way, the driving score engine 466 may place greater emphasis on some driving scores (e.g., excessive speeding) than on other driving scores (e.g., frequent braking).) 
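The weighted aggregation just cited, where per-category driving scores are combined into an overall score under adjustable weights, can be sketched as follows. The category names, values, and normalization choice are purely illustrative and not taken from the record.

```python
def overall_driving_score(category_scores, weights):
    """Combine per-category driving scores into one overall score.

    Both arguments map category name -> value. Weights are normalized so
    the overall score stays on the same scale as the inputs; raising a
    category's weight raises its relative impact on the overall score.
    """
    total_weight = sum(weights[c] for c in category_scores)
    if total_weight == 0:
        raise ValueError("weights must not all be zero")
    return sum(category_scores[c] * weights[c] for c in category_scores) / total_weight
```

For example, weighting excessive speeding three times as heavily as frequent braking, scores of 40.0 and 80.0 combine to (40·3 + 80·1)/4 = 50.0, so the speeding category dominates the result.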
comparing the vehicle behavior score to a minimum vehicle behavior score threshold; and (see at least Fig. 1-15 [0068-0167]: the detection system 110 may observe a driving behavior of the vehicle 120 during a time period. The detection system 110 may determine one or more driving scores for the vehicle 120 based on the observed driving behavior. In some aspects, the driving scores may be compared with one or more threshold values to determine whether the driving behavior of the vehicle 120 is unsafe or unsatisfactory. For example, if one or more of the driving scores are greater than one or more corresponding threshold values, the detection system 110 may determine that the vehicle 120 is driving in an unsatisfactory manner. In response to the determination of unsatisfactory driving, the detection system 110 may generate an indication of unsatisfactory driving behavior of the vehicle 120 . The driving behavior engine 464 may determine whether the vehicle 120 ignored a particular traffic law by correlating the observed driving behavior of the vehicle 120 with the expected driving behavior corresponding to the particular traffic law. In some aspects, the driving behavior engine 464 may determine the expected driving behavior associated with the particular traffic law, and correlate the observed driving behavior of the vehicle 120 with the expected driving behavior to determine the level of compliance with, or the level of deviation from, the particular traffic law. The determined level of compliance or deviation may be compared with a corresponding threshold value to determine whether the vehicle 120 violated the particular traffic law.) generating and outputting the vehicle behavior score and the vehicle control signals to the vehicle, the vehicle control signal configured to adjust operation of the vehicle system in response to the vehicle behavior score being less than the minimum vehicle behavior score threshold. (see at least Fig. 
1-15 [0068-0167]: The detection system 400 may provide the indication of unsatisfactory driving to one or more of the third-party entities 170 . The detection system 400 may limit one or more operations of the vehicle in response to determining that the at least one driving score exceeds the threshold value. The one or more operations may be selected to incentivize the vehicle (or its human driver) to improve its driving behavior, and thus improve its driving score, by exhibiting a safer driving behavior, obeying traffic signs, and complying with traffic laws. In some implementations, the one or more operations may include (but are not limited to) limiting a top speed of the vehicle, limiting the vehicle to a speed within a certain amount or percentage over a posted speed limit, prohibiting the vehicle from using HOV lanes or toll lanes, precluding membership in a platoon, revoking membership in a platoon, disabling or restricting one or more features of a manual driving mode of the vehicle, disabling or restricting one or more features of an autonomous driving mode of the vehicle, or any combination thereof.) Although Chaves does not explicitly label the scores using the same terminology or structural organization as recited in the claims, the reference teaches determining multiple driving scores corresponding to different driving attributes and assigning weighting values to determine an overall driving score. In view of these teachings, it would have been obvious to organize the behavior metrics according to different interaction contexts and compute corresponding scores prior to combining them using weighted aggregation. Regarding Claim 17, Chaves teaches The system of claim 16, further comprising at least one of: Chaves further teaches a road user with a user device that is configured to capture the road user data and is communicably coupled to the edge computing device; (see at least Fig. 
1-4 [0068-0144]: The pedestrian 130 may be any person or persons in a position or vantage point from which to observe the driving behavior of the vehicle 120 . For example, while standing on a sidewalk near an intersection, the pedestrian 130 may be in a position to witness the vehicle 120 running a stop sign at the intersection. In some instances, the pedestrian 130 may use the mobile computing device 190 to capture video of the vehicle 120 running the stop sign. The captured video may be sent to the detection system 110 . The captured video may be analyzed by the detection system 110 to determine whether the vehicle 120 did, in fact, run the stop sign. Although only one pedestrian 130 is shown in FIG. 1 for simplicity, the environment 100 may include any suitable number of pedestrians.) a road infrastructure device configured as and/or including one or more sensors configured to capture the road infrastructure data, the road infrastructure device communicably coupled to the edge computing device; (see at least Fig. 1-4 [0068-0144]: The road-side sensors 140 may be or may include any suitable device that can provide information from which the location, velocity, direction of travel, or orientation of the vehicle 120 can be derived. In some aspects, the road-side sensors 140 may include (but are not limited to) cameras, video recorders, RADAR devices, LIDAR devices, acoustic sensors, and so on. For example, a road-side sensor 140 equipped with a camera may capture images of a nearby road within the field of view (FOV) of the camera. The captured images may be sent to the detection system 110 . The detection system 110 may use the captured images to determine when the vehicle 120 passed through the camera's FOV. In some instances, the detection system 110 can analyze images captured by multiple road-side sensors 140 having known locations to determine the velocity of the vehicle 120 at one or more points along a particular route. 
) and a remote vehicle including sensors configured to capture the remote vehicle data, the remote vehicle communicably coupled to the edge computing device. (see at least Fig. 1-4 [0068-0144]: The other vehicles 160 may include autonomous vehicles, semi-autonomous vehicles, and conventional vehicles that are in a position or vantage point from which to observe the driving behavior of the vehicle 120. For instances in which the other vehicle is an autonomous vehicle, one or more sensors of the autonomous vehicle can be used to capture data, such as images, video, audio, or generate 3D point clouds from which the driving behavior of the vehicle 120 can be observed or determined. For instances in which the other vehicle is equipped with computer vision, the computer vision may be used to observe and record the driving behavior of the vehicle 120, at least while in range of the other vehicle.)

Regarding Claim 18, Chaves teaches The system of claim 17, wherein the edge computing device is further configured to: Chaves further teaches in response to the vehicle behavior score being less than the minimum vehicle behavior score threshold, identify one or more driving scores that are less than a corresponding minimum threshold driving score, wherein the vehicle control signal corresponds to at least one of the one or more driving scores that are less than the predetermined driving score. (see at least Fig. 1-15 [0068-0167]: the detection system 110 may observe a driving behavior of the vehicle 120 during a time period. The detection system 110 may determine one or more driving scores for the vehicle 120 based on the observed driving behavior. In some aspects, the driving scores may be compared with one or more threshold values to determine whether the driving behavior of the vehicle 120 is unsafe or unsatisfactory.
For example, if one or more of the driving scores are greater than one or more corresponding threshold values, the detection system 110 may determine that the vehicle 120 is driving in an unsatisfactory manner. In response to the determination of unsatisfactory driving, the detection system 110 may generate an indication of unsatisfactory driving behavior of the vehicle 120. The driving behavior engine 464 may determine whether the vehicle 120 ignored a particular traffic law by correlating the observed driving behavior of the vehicle 120 with the expected driving behavior corresponding to the particular traffic law. In some aspects, the driving behavior engine 464 may determine the expected driving behavior associated with the particular traffic law, and correlate the observed driving behavior of the vehicle 120 with the expected driving behavior to determine the level of compliance with, or the level of deviation from, the particular traffic law. The determined level of compliance or deviation may be compared with a corresponding threshold value to determine whether the vehicle 120 violated the particular traffic law. The detection system 400 may limit one or more operations of the vehicle in response to determining that the at least one driving score exceeds the threshold value.)

Regarding Claim 19, Chaves teaches The system of claim 17, Chaves further teaches wherein the edge computing device is further configured to output the vehicle control signal to at least one of the remote vehicle and the road user. (see at least [0022, 0031, 0136]: The detection system 400 may alert other vehicles and pedestrians of the vehicle's potential safety risk. For example, in some instances, the one or more operations may include (but are not limited to) activating visual indicators (such as the vehicle's hazard lights) and/or audible indicators to alert other drivers of the vehicle's potential safety risk.
In some other instances, the one or more operations may include instructing other vehicles to stay away from the vehicle or to increase their respective distances from the vehicle.)

Regarding Claim 20, Chaves teaches The system of claim 16, Chaves further teaches wherein the edge computing device is further configured to output the vehicle behavior score to a system communicably coupled to the vehicle. (see at least Fig. 1-4 [0130-0141]: the one or more processors 410 may execute the reporting program 416 to generate a report indicating the number of identified occurrences of each dangerous driving attribute exhibited by the vehicle 120 during a certain time period and/or within a certain geographic area. In some instances, the detection system 400 may provide the report to one or more of the third-party entities 170. The one or more third-party entities may include (but are not limited to) a human driver of the vehicle, a human passenger of the vehicle, an owner of the vehicle, an insurer of the vehicle, a heads-up display of the vehicle, a law enforcement agency, one or more police vehicles, a government motor vehicle agency, or one or more other vehicles.)

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DANA F ARTIMEZ whose telephone number is (571)272-3410. The examiner can normally be reached M-F: 9:00 am-3:30 pm EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Faris S. Almatrahi, can be reached at (313) 446-4821. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DANA F ARTIMEZ/
Examiner, Art Unit 3667

/FARIS S ALMATRAHI/
Supervisory Patent Examiner, Art Unit 3667
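The score-aggregation scheme the examiner reads into Chaves (per-attribute driving scores combined with weighting values into an overall score, which is then compared against a threshold to trigger operational limits) can be sketched as follows. This is a minimal sketch: the attribute names, weights, and threshold are illustrative assumptions, not values taken from the reference or the claims.

```python
# Hypothetical sketch of the weighted driving-score aggregation the
# examiner attributes to Chaves. Attribute names, weights, and the
# threshold are illustrative assumptions, not values from the reference.

# Weighting values for each driving attribute (chosen to sum to 1.0
# here, though nothing in the cited passages requires that).
ATTRIBUTE_WEIGHTS = {
    "speeding": 0.40,
    "traffic_sign_compliance": 0.35,
    "following_distance": 0.25,
}

def overall_driving_score(scores: dict) -> float:
    """Combine per-attribute scores (0-100, higher = worse) into one score."""
    return sum(ATTRIBUTE_WEIGHTS[attr] * scores[attr] for attr in ATTRIBUTE_WEIGHTS)

def is_unsatisfactory(scores: dict, threshold: float = 50.0) -> bool:
    """Flag the vehicle when the weighted overall score exceeds the threshold."""
    return overall_driving_score(scores) > threshold

observed = {"speeding": 80, "traffic_sign_compliance": 40, "following_distance": 20}
# 0.40*80 + 0.35*40 + 0.25*20 = 51.0, which exceeds the 50.0 threshold
```

The claim-mapping question is then whether organizing these per-attribute scores by "interaction context" before the weighted combination is a patentable distinction or, as the examiner argues, an obvious reorganization of the same computation.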

Prosecution Timeline

Apr 19, 2024
Application Filed
Sep 10, 2025
Non-Final Rejection — §103
Dec 24, 2025
Response Filed
Mar 10, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596371
SYSTEM AND METHOD FOR INTERCEPTION AND COUNTERING UNMANNED AERIAL VEHICLES (UAVS)
2y 5m to grant Granted Apr 07, 2026
Patent 12573078
METHOD AND APPARATUS FOR DETERMINING VEHICLE LOCATION BASED ON OPTICAL CAMERA COMMUNICATION
2y 5m to grant Granted Mar 10, 2026
Patent 12571646
Automated Discovery and Monitoring of Uncrewed Aerial Vehicle Ground-Support Infrastructure
2y 5m to grant Granted Mar 10, 2026
Patent 12560441
METHOD AND APPARATUS FOR OPTIMIZING A MULTI-STOP TOUR WITH FLEXIBLE MEETING LOCATIONS
2y 5m to grant Granted Feb 24, 2026
Patent 12560936
SYSTEMS AND METHODS FOR OBJECT DETECTION
2y 5m to grant Granted Feb 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
58%
Grant Probability
99%
With Interview (+43.9%)
3y 2m
Median Time to Grant
Moderate
PTA Risk
Based on 80 resolved cases by this examiner. Grant probability derived from career allow rate.
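One plausible reading of how the "With Interview" figure above could be derived is the career allow rate plus the interview lift in percentage points, capped at 99%. Both the additive model and the cap are assumptions made only so that the displayed values (58% base, +43.9% lift, 99% with interview) reconcile; the dashboard's actual formula is not documented here.

```python
# Hypothetical reconstruction of the "With Interview" projection.
# Assumption: the interview lift is added to the career allow rate in
# percentage points, and the result is capped at 99%. Neither the
# additive model nor the cap is stated in the source data.

def with_interview_probability(base_rate: float, lift: float, cap: float = 0.99) -> float:
    """Estimated grant probability after an examiner interview (capped)."""
    return min(base_rate + lift, cap)

p = with_interview_probability(0.58, 0.439)  # 0.58 + 0.439 = 1.019, capped to 0.99
```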
