Prosecution Insights
Last updated: April 19, 2026
Application No. 18/411,275

DRIVER CAPABILITY MONITORING

Status: Final Rejection (§102, §103)
Filed: Jan 12, 2024
Examiner: HEFLIN, HARRISON JAMES RIEL
Art Unit: 3665
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Bendix Commercial Vehicle Systems LLC
OA Round: 2 (Final)

Grant Probability: 73% (Favorable)
Predicted OA Rounds: 3-4
Predicted Time to Grant: 2y 9m
Grant Probability with Interview: 86%

Examiner Intelligence

Career Allow Rate: 73% (101 granted / 139 resolved), above average at +20.7% vs Tech Center average
Interview Lift: +13.0% (moderate), measured over resolved cases with interview
Typical Timeline: 2y 9m average prosecution; 22 applications currently pending
Career History: 161 total applications across all art units

Statute-Specific Performance

§101: 13.2% (-26.8% vs TC avg)
§103: 47.7% (+7.7% vs TC avg)
§102: 20.2% (-19.8% vs TC avg)
§112: 15.4% (-24.6% vs TC avg)

Deltas are measured against a Tech Center average estimate; based on career data from 139 resolved cases.
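The dashboard arithmetic can be reconstructed from the figures shown above. This is an illustrative sketch, not the analytics vendor's actual code; the function names are my own, and the interview-lift interpretation (the point difference between the with-interview rate of 86% and the baseline career rate of 73%) is an assumption consistent with the displayed numbers.

```python
# Illustrative reconstruction of the examiner-stats arithmetic shown above.
# Figures: 101 granted of 139 resolved; 86% allow rate with interview.

def allowance_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved applications."""
    return 100.0 * granted / resolved

career_rate = allowance_rate(granted=101, resolved=139)  # ~72.7%, displayed as 73%

# The dashboard reports the career rate as +20.7 points vs the Tech Center
# average, which implies a TC average estimate of roughly 52%.
tc_average = career_rate - 20.7

# Assumed reading of "+13.0% Interview Lift": with-interview rate minus baseline.
interview_lift = 86.0 - 73.0

print(f"career allow rate: {career_rate:.1f}%")
print(f"implied TC average: {tc_average:.1f}%")
print(f"interview lift: +{interview_lift:.1f} points")
```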

Office Action

Final Rejection under §102 and §103.
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 01/17/2024 has been considered by the Examiner.

Specification

The disclosure is objected to because of the following minor informalities: in paragraph [0012], “The autonomous function controller 12” appears as if it should instead read “The autonomous function controller 18” or equivalent in order to match fig. 2 and the surrounding disclosure. Appropriate correction is required.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-2, 4-8, and 11-12 are rejected under 35 U.S.C. 102(a)(1) and (a)(2) as being anticipated by James (US 2017/0166222 A1).
Regarding claim 1, the Examiner notes that in paragraph [0022] of the present specification, Applicant defines the phrase “reasonably comparable” and recites “Reasonable comparison means that the response is near to the automated driving model and any difference does not affect the safe operation of the vehicle” and similarly in paragraph [0026] that “Reasonable comparisons mean that there would be no impact to the safety of the vehicle or how the vehicle is operating in traffic.”

James discloses a vehicle system for a vehicle capable of being both autonomously driven and human driven (In paragraph [0008], James discloses that an autonomous vehicle can have a manual operational mode and one or more autonomous operational modes) comprising: a plurality of sensors on the vehicle for transmitting information about the vehicle environment and the response of the vehicle to actions implemented by at least one of the autonomous control and the human control (In paragraphs [0025-0042], James discloses that the autonomous vehicle 100 can include a sensor system 120 that can include one or more vehicle sensors 121 that can be configured to detect, determine, assess, monitor, measure, quantify and/or sense information about the autonomous vehicle 100 itself, including manual driving data); a controller for implementing automated control on the vehicle (In paragraph [0019], James discloses that the autonomous vehicle 100 can include one or more processors 110; in paragraph [0057], James discloses that the autonomous vehicle 100 can include an autonomous driving module 155 including a control module 158; see also paragraphs [0117-0118], where James discloses that the systems, components and/or processes described above can be realized in hardware or a combination of hardware and software such as a processing system with computer-usable program code that, when being loaded and executed, controls the processing system such that it carries out the methods described herein or
can be embedded in a computer-readable storage, such as a computer program product or other data programs storage device, readable by a machine, tangibly embodying a program of instructions executable by the machine to perform methods and processes described herein), the controller having: an input for receiving vehicle information from the plurality of sensors (In paragraphs [0025-0042], James discloses that the autonomous vehicle 100 can include a sensor system 120 that can include one or more vehicle sensors 121 that can be configured to detect, determine, assess, monitor, measure, quantify and/or sense information about the autonomous vehicle 100 itself, including manual driving data; in paragraphs [0057-0058], James discloses that the autonomous driving module 155 can include a perception module 156 which can receive data from the sensor system 120 and/or any other type of system capable of capturing information relating to the autonomous vehicle 100 and/or the external environment of the autonomous vehicle 100); a memory for storing an autonomous driving model and vehicle response results when the vehicle is under human control (In paragraph [0020], James discloses that the autonomous vehicle 100 can include one or more data stores 115 for storing one or more types of data; in paragraph [0097-0099], James discloses that the acquired driving data relating to one or more manual driving maneuvers (e.g., a human driver's execution of one or more driving maneuvers) can be evaluated relative to a driving scene model, including comparing past, current, and/or predicted manual driving maneuver(s) of the autonomous vehicle 100 to a driving scene model, where the determination can be made relative to a predetermined standard, such as the same standard that is used to determine whether a potential autonomous driving maneuver is acceptable or unacceptable when the autonomous vehicle 100 is operating in an autonomous operational mode; see also paragraphs [0117-0118], 
where James discloses that the systems, components and/or processes described above can be realized in hardware or a combination of hardware and software such as a processing system with computer-usable program code that, when being loaded and executed, controls the processing system such that it carries out the methods described herein or can be embedded in a computer-readable storage, such as a computer program product or other data programs storage device, readable by a machine, tangibly embodying a program of instructions executable by the machine to perform methods and processes described herein); and control logic for comparing the information of the vehicle response results when under human control to the autonomous driving model, wherein the control logic switches to autonomous control when the vehicle driving response results under human control are not reasonably comparable to the autonomous driving model (In paragraph [0097-0099], James discloses that the acquired driving data relating to one or more manual driving maneuvers (e.g., a human driver's execution of one or more driving maneuvers) can be evaluated relative to a driving scene model, including comparing past, current, and/or predicted manual driving maneuver(s) of the autonomous vehicle 100 to a driving scene model, where the determination can be made relative to a predetermined standard, such as the same standard that is used to determine whether a potential autonomous driving maneuver is acceptable or unacceptable when the autonomous vehicle 100 is operating in an autonomous operational mode; in paragraph [0102], James discloses that responsive to determining that the manual driving maneuver is unacceptable, feedback can be provided to a user (e.g., the human driver of the autonomous vehicle 100 or some other person), where the feedback can be active feedback; in paragraphs [0085-0086], James discloses that active feedback can include implementing one or more corrective actions implemented 
automatically by the autonomous vehicle 100, including any change in movement of the autonomous vehicle 100, such as a turn of the steering wheel position, activating or increasing braking, deactivating or decreasing braking, activating or increasing acceleration, deactivating or decreasing acceleration, and/or a movement in the lateral direction 106, or the corrective action may override or alter any manual driving inputs received from the human driver, for example to avoid a collision with another object or to avoid a hazardous condition, where such a corrective action can be based on a potential driving maneuver that would be selected by the planning/decision-making module 157).

Regarding claim 2, James further discloses wherein the control logic further indicates to the human driver that the autonomous control has been activated (In paragraph [0087-0088], James discloses that active feedback can include the autonomous vehicle 100 providing haptic feedback to a human driver such as sending a control signal to one or more haptic actuators 136 associated with the vehicle seat 135 to cause the vehicle seat 135 (or a portion thereof) to vibrate).
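The comparison-and-switch logic recited in claim 1 can be sketched as follows. This is a minimal illustration only: the metric names, model values, and tolerances below are assumptions, since neither the claim nor the specification's definition of "reasonably comparable" (paragraphs [0022] and [0026]) gives numeric thresholds.

```python
# Sketch of claim 1's control logic under assumed metrics and tolerances.
# The autonomous driving model and the safety tolerances are hypothetical values.

AUTONOMOUS_MODEL = {"braking_force": 0.4, "following_distance_m": 40.0}
SAFETY_TOLERANCE = {"braking_force": 0.15, "following_distance_m": 10.0}

def reasonably_comparable(human_response: dict) -> bool:
    """Per the spec's definition: the response is 'near to the automated driving
    model' and any difference does not affect safe operation of the vehicle."""
    return all(
        abs(human_response[k] - AUTONOMOUS_MODEL[k]) <= SAFETY_TOLERANCE[k]
        for k in AUTONOMOUS_MODEL
    )

def control_mode(human_response: dict) -> str:
    # Claim 1: switch to autonomous control when the human driving response
    # is not reasonably comparable to the autonomous driving model.
    return "human" if reasonably_comparable(human_response) else "autonomous"

print(control_mode({"braking_force": 0.45, "following_distance_m": 38.0}))  # human
print(control_mode({"braking_force": 0.90, "following_distance_m": 12.0}))  # autonomous
```

Mapped onto James, `reasonably_comparable` plays the role of evaluating a manual maneuver against a predetermined standard ([0097-0099]), and the switch corresponds to the active-feedback corrective action ([0085-0086], [0102]).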
Regarding claim 4, James further discloses wherein the response of the vehicle information collected includes at least one of braking force, following distance to a forward vehicle, lane keeping, and distance to a non-vehicle obstacle (In paragraph [0097-0099], James discloses that the acquired driving data relating to one or more manual driving maneuvers (e.g., a human driver's execution of one or more driving maneuvers) can be evaluated relative to a driving scene model, including comparing past, current, and/or predicted manual driving maneuver(s) of the autonomous vehicle 100 to a driving scene model, where the driving data can include data relating to one or more characteristics of the autonomous vehicle 100 (e.g., steering wheel position, brake pedal position, accelerator pedal position, wheel speed, any manual inputs provided by the human driver, etc.), and where the determination can be made relative to a predetermined standard, such as the same standard that is used to determine whether a potential autonomous driving maneuver is acceptable or unacceptable when the autonomous vehicle 100 is operating in an autonomous operational mode; in paragraphs [0065-0066], James discloses that one or more potential autonomous driving maneuvers can be scored according to a predetermined scoring standard, as an example, if the potential path or driving maneuver will result in the autonomous vehicle 100 passing within a predetermined distance from another object, then the score may be lowered, where the nature or identity of the object in the external environment can affect the predetermined distance and, thus, the scoring, for instance, the predetermined distance may be larger for some objects compared to the predetermined distance for other objects; in paragraph [0067], James discloses that the scoring can be affected by the amount which the autonomous vehicle 100 stays within a current travel lane or by the degree to which the autonomous vehicle 100 is centered in a
current travel lane).

Regarding claim 5, James further discloses wherein the autonomous driving model is created from vehicle information when the vehicle is under autonomous control (In paragraph [0097-0099], James discloses that the acquired driving data relating to one or more manual driving maneuvers (e.g., a human driver's execution of one or more driving maneuvers) can be evaluated relative to a driving scene model, including comparing past, current, and/or predicted manual driving maneuver(s) of the autonomous vehicle 100 to a driving scene model, where the determination can be made relative to a predetermined standard, such as the same standard that is used to determine whether a potential autonomous driving maneuver is acceptable or unacceptable when the autonomous vehicle 100 is operating in an autonomous operational mode).

Regarding claim 6, James further discloses wherein the autonomous driving model is loaded into the memory of the controller by one of a manufacturer of the vehicle (In paragraph [0024], James discloses that at least a portion of the data (e.g., map data, traffic rules data, driving scene models, and/or other data) can be located in one or more data stores 115 located onboard the autonomous vehicle 100, where the data can be obtained by the autonomous vehicle 100 in any suitable manner, or it can be provided by an entity (e.g., a vehicle manufacturer) for use by the autonomous vehicle 100).
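The maneuver-scoring scheme James describes in paragraphs [0065-0067] (lower the score when the path passes within a predetermined, object-dependent distance of another object; adjust for lane centering) can be sketched as below. The clearance thresholds and penalty weights are illustrative assumptions; James does not disclose specific values.

```python
# Illustrative sketch of the scoring standard in James [0065-0067].
# All numeric values (clearances, penalties, base score) are assumed.

CLEARANCE_BY_OBJECT = {"pedestrian": 3.0, "vehicle": 1.5}  # metres, object-dependent

def score_maneuver(min_clearance_m: float, nearest_object: str,
                   lane_center_offset_m: float, base_score: float = 100.0) -> float:
    score = base_score
    # Lower the score if the potential path passes within the predetermined
    # distance of another object; the threshold depends on the object's identity.
    if min_clearance_m < CLEARANCE_BY_OBJECT[nearest_object]:
        score -= 40.0
    # Scoring is also affected by how well the vehicle stays centered
    # in its current travel lane ([0067]).
    score -= 10.0 * abs(lane_center_offset_m)
    return score

print(score_maneuver(2.0, "pedestrian", 0.5))  # 55.0: within pedestrian clearance
print(score_maneuver(2.0, "vehicle", 0.5))     # 95.0: same gap is acceptable for a vehicle
```

The two calls show the point James makes about object identity: the same 2 m clearance is penalized next to a pedestrian but not next to another vehicle.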
Regarding claim 7, the Examiner notes that in paragraph [0022] of the present specification, Applicant defines the phrase “reasonably comparable” and recites “Reasonable comparison means that the response is near to the automated driving model and any difference does not affect the safe operation of the vehicle” and similarly in paragraph [0026] that “Reasonable comparisons mean that there would be no impact to the safety of the vehicle or how the vehicle is operating in traffic.”

James discloses a controller for implementing automated driving functions on a vehicle, the vehicle being capable of both autonomous control and human control (In paragraph [0008], James discloses that an autonomous vehicle can have a manual operational mode and one or more autonomous operational modes; in paragraph [0019], James discloses that the autonomous vehicle 100 can include one or more processors 110; in paragraph [0057], James discloses that the autonomous vehicle 100 can include an autonomous driving module 155 including a control module 158; see also paragraphs [0117-0118], where James discloses that the systems, components and/or processes described above can be realized in hardware or a combination of hardware and software such as a processing system with computer-usable program code that, when being loaded and executed, controls the processing system such that it carries out the methods described herein or can be embedded in a computer-readable storage, such as a computer program product or other data programs storage device, readable by a machine, tangibly embodying a program of instructions executable by the machine to perform methods and processes described herein) comprising: an input for receiving information about the vehicle environment and the response of the vehicle to action implemented by at least one of the autonomous control and the human control from associated sensors on the vehicle (In paragraphs [0025-0042], James discloses that the autonomous vehicle
100 can include a sensor system 120 that can include one or more vehicle sensors 121 that can be configured to detect, determine, assess, monitor, measure, quantify and/or sense information about the autonomous vehicle 100 itself, including manual driving data; in paragraphs [0057-0058], James discloses that the autonomous driving module 155 can include a perception module 156 which can receive data from the sensor system 120 and/or any other type of system capable of capturing information relating to the autonomous vehicle 100 and/or the external environment of the autonomous vehicle 100); a memory for storing vehicle response information when the vehicle is under human control and an autonomous driving model (In paragraph [0020], James discloses that the autonomous vehicle 100 can include one or more data stores 115 for storing one or more types of data; in paragraph [0097-0099], James discloses that the acquired driving data relating to one or more manual driving maneuvers (e.g., a human driver's execution of one or more driving maneuvers) can be evaluated relative to a driving scene model, including comparing past, current, and/or predicted manual driving maneuver(s) of the autonomous vehicle 100 to a driving scene model, where the determination can be made relative to a predetermined standard, such as the same standard that is used to determine whether a potential autonomous driving maneuver is acceptable or unacceptable when the autonomous vehicle 100 is operating in an autonomous operational mode; see also paragraphs [0117-0118], where James discloses that the systems, components and/or processes described above can be realized in hardware or a combination of hardware and software such as a processing system with computer-usable program code that, when being loaded and executed, controls the processing system such that it carries out the methods described herein or can be embedded in a computer-readable storage, such as a computer program product or other 
data programs storage device, readable by a machine, tangibly embodying a program of instructions executable by the machine to perform methods and processes described herein); and control logic for comparing the information of the vehicle response when under human control to vehicle results when the vehicle is under autonomous control, wherein the control logic switches to automated control if the vehicle results under human control is not reasonably comparable to the vehicle results under autonomous control (In paragraph [0097-0099], James discloses that the acquired driving data relating to one or more manual driving maneuvers (e.g., a human driver's execution of one or more driving maneuvers) can be evaluated relative to a driving scene model, including comparing past, current, and/or predicted manual driving maneuver(s) of the autonomous vehicle 100 to a driving scene model, where the determination can be made relative to a predetermined standard, such as the same standard that is used to determine whether a potential autonomous driving maneuver is acceptable or unacceptable when the autonomous vehicle 100 is operating in an autonomous operational mode; in paragraph [0102], James discloses that responsive to determining that the manual driving maneuver is unacceptable, feedback can be provided to a user (e.g., the human driver of the autonomous vehicle 100 or some other person), where the feedback can be active feedback; in paragraphs [0085-0086], James discloses that active feedback can include implementing one or more corrective actions implemented automatically by the autonomous vehicle 100, including any change in movement of the autonomous vehicle 100, such as a turn of the steering wheel position, activating or increasing braking, deactivating or decreasing braking, activating or increasing acceleration, deactivating or decreasing acceleration, and/or a movement in the lateral direction 106, or the corrective action may override or alter any manual 
driving inputs received from the human driver, for example to avoid a collision with another object or to avoid a hazardous condition, where such a corrective action can be based on a potential driving maneuver that would be selected by the planning/decision-making module 157).

Regarding claim 8, the Examiner notes that in paragraph [0022] of the present specification, Applicant defines the phrase “reasonably comparable” and recites “Reasonable comparison means that the response is near to the automated driving model and any difference does not affect the safe operation of the vehicle” and similarly in paragraph [0026] that “Reasonable comparisons mean that there would be no impact to the safety of the vehicle or how the vehicle is operating in traffic.”

James discloses a method for controlling a vehicle (In paragraph [0008], James discloses that an autonomous vehicle can have a manual operational mode and one or more autonomous operational modes) comprising: receiving information about the vehicle environment and the response of the vehicle to action implemented by at least one of the autonomous control and the human control from associated sensors on the vehicle (In paragraphs [0025-0042], James discloses that the autonomous vehicle 100 can include a sensor system 120 that can include one or more vehicle sensors 121 that can be configured to detect, determine, assess, monitor, measure, quantify and/or sense information about the autonomous vehicle 100 itself, including manual driving data; in paragraphs [0057-0058], James discloses that the autonomous driving module 155 can include a perception module 156 which can receive data from the sensor system 120 and/or any other type of system capable of capturing information relating to the autonomous vehicle 100 and/or the external environment of the autonomous vehicle 100); storing vehicle response information when the vehicle is under human control (In paragraph [0020], James discloses that the autonomous vehicle
100 can include one or more data stores 115 for storing one or more types of data; in paragraph [0097-0099], James discloses that the acquired driving data relating to one or more manual driving maneuvers (e.g., a human driver's execution of one or more driving maneuvers) can be evaluated relative to a driving scene model, including comparing past, current, and/or predicted manual driving maneuver(s) of the autonomous vehicle 100 to a driving scene model, where the determination can be made relative to a predetermined standard, such as the same standard that is used to determine whether a potential autonomous driving maneuver is acceptable or unacceptable when the autonomous vehicle 100 is operating in an autonomous operational mode; see also paragraphs [0117-0118], where James discloses that the systems, components and/or processes described above can be realized in hardware or a combination of hardware and software such as a processing system with computer-usable program code that, when being loaded and executed, controls the processing system such that it carries out the methods described herein or can be embedded in a computer-readable storage, such as a computer program product or other data programs storage device, readable by a machine, tangibly embodying a program of instructions executable by the machine to perform methods and processes described herein); comparing the information of the vehicle response when under human control to an autonomous driving model (In paragraph [0097-0099], James discloses that the acquired driving data relating to one or more manual driving maneuvers (e.g., a human driver's execution of one or more driving maneuvers) can be evaluated relative to a driving scene model, including comparing past, current, and/or predicted manual driving maneuver(s) of the autonomous vehicle 100 to a driving scene model, where the determination can be made relative to a predetermined standard, such as the same standard that is used to determine 
whether a potential autonomous driving maneuver is acceptable or unacceptable when the autonomous vehicle 100 is operating in an autonomous operational mode); and switching to automated control in response to the vehicle results under human control not being reasonably comparable to the autonomous driving model (In paragraph [0102], James discloses that responsive to determining that the manual driving maneuver is unacceptable, feedback can be provided to a user (e.g., the human driver of the autonomous vehicle 100 or some other person), where the feedback can be active feedback; in paragraphs [0085-0086], James discloses that active feedback can include implementing one or more corrective actions implemented automatically by the autonomous vehicle 100, including any change in movement of the autonomous vehicle 100, such as a turn of the steering wheel position, activating or increasing braking, deactivating or decreasing braking, activating or increasing acceleration, deactivating or decreasing acceleration, and/or a movement in the lateral direction 106, or the corrective action may override or alter any manual driving inputs received from the human driver, for example to avoid a collision with another object or to avoid a hazardous condition, where such a corrective action can be based on a potential driving maneuver that would be selected by the planning/decision-making module 157). 
Regarding claim 11, James further discloses creating the autonomous driving model while the vehicle is under autonomous control and storing the autonomous driving model in a memory for comparison to the vehicle response when under human control (In paragraph [0020], James discloses that the autonomous vehicle 100 can include one or more data stores 115 for storing one or more types of data; in paragraph [0097-0099], James discloses that the acquired driving data relating to one or more manual driving maneuvers (e.g., a human driver's execution of one or more driving maneuvers) can be evaluated relative to a driving scene model, including comparing past, current, and/or predicted manual driving maneuver(s) of the autonomous vehicle 100 to a driving scene model, where the determination can be made relative to a predetermined standard, such as the same standard that is used to determine whether a potential autonomous driving maneuver is acceptable or unacceptable when the autonomous vehicle 100 is operating in an autonomous operational mode; in paragraph [0024], James discloses that at least a portion of the data (e.g., map data, traffic rules data, driving scene models, and/or other data) can be located in one or more data stores 115 located onboard the autonomous vehicle 100, where the data can be obtained by the autonomous vehicle 100 in any suitable manner, or it can be provided by an entity (e.g., a vehicle manufacturer) for use by the autonomous vehicle 100). 
Regarding claim 12, James further discloses entering a preconfigured autonomous driving model and storing the autonomous driving model in a memory for comparison to the vehicle response when under human control (In paragraph [0020], James discloses that the autonomous vehicle 100 can include one or more data stores 115 for storing one or more types of data; in paragraph [0097-0099], James discloses that the acquired driving data relating to one or more manual driving maneuvers (e.g., a human driver's execution of one or more driving maneuvers) can be evaluated relative to a driving scene model, including comparing past, current, and/or predicted manual driving maneuver(s) of the autonomous vehicle 100 to a driving scene model, where the determination can be made relative to a predetermined standard, such as the same standard that is used to determine whether a potential autonomous driving maneuver is acceptable or unacceptable when the autonomous vehicle 100 is operating in an autonomous operational mode; in paragraph [0024], James discloses that at least a portion of the data (e.g., map data, traffic rules data, driving scene models, and/or other data) can be located in one or more data stores 115 located onboard the autonomous vehicle 100, where the data can be obtained by the autonomous vehicle 100 in any suitable manner, or it can be provided by an entity (e.g., a vehicle manufacturer) for use by the autonomous vehicle 100).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over James (US 2017/0166222 A1), in view of Stent (US 2023/0278572 A1).

Regarding claim 3, James does not explicitly disclose wherein the human driver can override the autonomous control and the memory will record the human override action. However, Stent teaches wherein the human driver can override the autonomous control and the memory will record the human override action (In paragraph [0030], Stent teaches that an override mechanism, such as an override switch, may be used to turn off or disengage a vehicle’s autonomous control system; in paragraph [0089], Stent teaches that the system can determine that the driver turned off an automatic ADAS feature and chose not to use it in past trips, or has repeatedly turned off an ADAS feature during situations where the feature would be appropriate).
Stent is considered to be analogous to the claimed invention in that they both pertain to overriding autonomous control and recording the overrides. It would be obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to implement the teachings of Stent with the system as disclosed by James, where doing so “can train the system to accurately determine when a recommendation is appropriate and likely successful” as suggested by Stent in paragraph [0034], advantageously utilizing the recorded overrides to increase the contextual sensitivity and accuracy of the system, for example. Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over James (US 2017/0166222 A1), in view of Avedisov (US 2025/0187619 A1). Regarding claim 9, James does not explicitly disclose wherein switching to automated control occurs in a restricted driving zone. However, Avedisov teaches wherein switching to automated control occurs in a restricted driving zone (In paragraph [0077], Avedisov teaches that the vehicle is equipped with GPS and advanced sensors to detect when the vehicle enters or is about to enter a designated autonomous-only zone, and upon entering a zone, the vehicle's sensors send the geographic location data to the server, the server confirms the vehicle's presence in an autonomous-only zone and sends a command back to the vehicle to activate the autonomous driving mode, and as the autonomous mode is activated, the vehicle's user interface notifies the driver, informing the driver that the vehicle is now in autonomous mode and that manual controls are temporarily disabled). Avedisov is considered to be analogous to the claimed invention in that they both pertain to switching to automated control in a restricted driving zone. 
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to implement the teachings of Avedisov with the method as disclosed by James, where the designated autonomous-only zones "may be in areas in urban settings where manual driving is either less efficient due to heavy traffic congestion or prohibited by local traffic laws to facilitate smoother traffic flow," as suggested by Avedisov, advantageously increasing the efficiency of operations of the autonomous vehicles, for example.

Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over James (US 2017/0166222 A1) in view of Chaves (US 2022/0355802 A1).

Regarding claim 10, James does not explicitly disclose wherein switching to automated control means that the speed of the vehicle when under human control is limited to a predetermined speed. However, Chaves teaches wherein switching to automated control means that the speed of the vehicle when under human control is limited to a predetermined speed (in paragraphs [0164]-[0165], Chaves teaches taking corrective actions associated with a vehicle based on the at least one driving score, including limiting a speed of the vehicle or limiting the vehicle to a speed within a certain amount or percentage over a posted speed limit, and disabling or restricting one or more features of a manual driving mode of the vehicle).

Chaves is considered to be analogous to the claimed invention in that they both pertain to limiting a maximum speed of a vehicle as a corrective action based on a driver's performance.
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to implement the teachings of Chaves with the method as disclosed by James, where "the detection system 400 may gradually increase a vehicle's top speed in conjunction with increases in the vehicle's driving score," as suggested by Chaves in paragraph [0140], thereby advantageously increasing safety of operation of the vehicle, for example.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

Newman (US 2025/0214576 A1) teaches automatic collision-avoidance and/or harm-minimization of vehicle collisions.

Jean (US 2024/0253624 A1) teaches implementing contextual speed limits in an ISA system having both positioning and situational-aware subsystems.

Kuwahara (US 11,209,819 B2) teaches a vehicle driving control system in which, during traveling under an automatic driving mode, when the driver state determining unit determines from a driver state detected by the driver monitoring device that the driver is not in a state capable of driving normally, the retreat mode controller sets a retreat mode in which an override operation is disabled and the own vehicle is caused to travel for retreat.

Kaino (US 2021/0323568 A1) teaches a vehicle control system in which a controller executes automatic stop control in the case where a driver abnormality is determined.

Yun (US 2020/0341465 A1) teaches an apparatus and method for controlling driving of a vehicle, including determining whether to switch to autonomous driving of the subject vehicle based on the autonomous driving data and manual driving data of the subject vehicle and the autonomous driving data of another vehicle.

Mahajan (US 2020/0216079 A1) teaches systems and methods for driver profile based warning and vehicle control.

Prokhorov (US 2018/0164808 A1) teaches mixed autonomous and manual control of a vehicle.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Harrison Heflin, whose telephone number is (571) 272-5629. The examiner can normally be reached Monday - Friday, 1:00 PM - 10:00 PM EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Hunter Lonsberry, can be reached at 571-272-7298. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/HARRISON HEFLIN/
Examiner, Art Unit 3665

/HUNTER B LONSBERRY/
Supervisory Patent Examiner, Art Unit 3665

Prosecution Timeline

Jan 12, 2024
Application Filed
Aug 22, 2025
Non-Final Rejection — §102, §103
Oct 31, 2025
Response Filed
Dec 19, 2025
Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596369
CONTROL SYSTEM, MOBILE OBJECT, CONTROL METHOD, AND STORAGE MEDIUM
2y 5m to grant · Granted Apr 07, 2026
Patent 12566443
ROBOT TRAVELING IN SPECIFIC SPACE AND CONTROL METHOD THEREOF
2y 5m to grant · Granted Mar 03, 2026
Patent 12559894
SYSTEMS AND METHODS TO APPLY SURFACE TREATMENTS
2y 5m to grant · Granted Feb 24, 2026
Patent 12541202
UNMANNED VEHICLE AND INFORMATION PROCESSING METHOD
2y 5m to grant · Granted Feb 03, 2026
Patent 12497275
APPARATUS FOR MOVING A PAYLOAD
2y 5m to grant · Granted Dec 16, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
73%
Grant Probability
86%
With Interview (+13.0%)
2y 9m
Median Time to Grant
Moderate
PTA Risk
Based on 139 resolved cases by this examiner. Grant probability derived from career allow rate.
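The projection figures above follow from simple arithmetic on the examiner's career data (101 grants out of 139 resolved cases, plus the 13.0-point interview lift). A minimal sketch of that derivation, assuming the dashboard simply rounds the career allow rate and adds the lift in percentage points (the variable names here are illustrative, not the tool's actual data model):

```python
# Illustrative reconstruction of the dashboard's headline numbers.
# Inputs come from the examiner stats shown on this page.
granted = 101          # career grants
resolved = 139         # career resolved cases
interview_lift = 13.0  # percentage-point lift when an interview is held

allow_rate = round(granted / resolved * 100)   # career allow rate, in %
with_interview = allow_rate + interview_lift   # assumed additive lift

print(f"Grant probability: {allow_rate}%")           # 73%
print(f"With interview:    {with_interview:.0f}%")   # 86%
```

Note the additive treatment of the interview lift is an assumption; the tool may instead compute the with-interview rate directly from the subset of resolved cases that had an interview.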
