Prosecution Insights
Last updated: April 19, 2026
Application No. 17/682,698

AUTOMATIC DRIVER ASSIST SYSTEM FOR A VEHICLE

Non-Final OA — §102, §103
Filed
Feb 28, 2022
Examiner
ALKIRSH, AHMED
Art Unit
3668
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Nissan North America, Inc.
OA Round
5 (Non-Final)
Grant Probability: 54% (Moderate)
OA Rounds: 5-6
To Grant: 3y 0m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 54% (23 granted / 43 resolved; +1.5% vs TC avg)
Interview Lift: +53.7% (strong lift among resolved cases with interview)
Avg Prosecution: 3y 0m (typical timeline; 63 applications currently pending)
Total Applications: 106 across all art units (career history)
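The headline allow rate is simple arithmetic on the counts shown above. A minimal sketch in Python, using only figures from this report (note the quotient is ≈53.5%, so the displayed 54% reflects the dashboard's rounding):

```python
# Career allow rate from the raw counts reported above.
granted = 23
resolved = 43

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate * 100:.1f}%")  # 53.5% (shown as 54%)
```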

Statute-Specific Performance

§101: 20.2% (-19.8% vs TC avg)
§103: 54.5% (+14.5% vs TC avg)
§102: 22.5% (-17.5% vs TC avg)
§112: 2.8% (-37.2% vs TC avg)
Tech Center averages are estimates. Based on career data from 43 resolved cases.
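The per-statute deltas can be sanity-checked by backing out the implied Tech Center baseline (rate minus delta). A small sketch, assuming each delta is a simple difference against one estimated average:

```python
# Back out the implied TC-average baseline for each statute:
# if delta = examiner_rate - tc_avg, then tc_avg = examiner_rate - delta.
stats = {
    "101": (20.2, -19.8),
    "103": (54.5, 14.5),
    "102": (22.5, -17.5),
    "112": (2.8, -37.2),
}

for statute, (rate, delta) in stats.items():
    baseline = round(rate - delta, 1)
    print(f"Section {statute}: examiner {rate}% -> implied TC avg {baseline}%")
```

Every statute implies the same 40.0% baseline, consistent with the deltas having been computed against a single Tech Center average estimate rather than per-statute averages.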

Office Action

§102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination (RCE) under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 11/24/2025 has been entered.

Status of Claims

Claims 1-20 of U.S. Application No. 17/682,698 filed on 02/28/2022 were examined. Examiner filed a non-final rejection on 02/27/2024. Applicant filed remarks on 05/23/2024. Claims 1-20 were examined. Examiner filed a final rejection on 08/13/2024. Applicant filed an RCE on 12/13/2024. Claims 1 and 14 were amended. Claims 1-20 were examined. Examiner filed a non-final rejection on 02/25/2025. Applicant filed remarks on 05/16/2025. Claims 1-20 were examined. Examiner filed a final rejection on 08/27/2025. Applicant filed an RCE on 11/24/2025. Claims 1, 12, 14, 18 and 19 were amended. Claims 1-20 are presented and pending examination.

Response to Arguments

Regarding the claim rejections under 35 USC 102 and 103: Applicant's arguments filed 11/24/2025 with respect to Sun et al. (US 9308914 B1) in view of Singh (US 20180231976 A1) have been fully considered but they are not persuasive.
Regarding claims 1 and 14, the applicant argues that independent claims 1 and 14 recite, inter alia, a controller configured to activate one of a plurality of vehicle drive assist system components from a deactivated state based on the driver data and on the external environment data without manual intervention of the driver, and the activated vehicle drive assist system component being activated prior to operation of the activated vehicle drive assist system component. Applicant argues that Sun does not disclose this limitation because Sun activates the driver assistance feature based only on driver status information, as shown in FIG. 5 of Sun, and operates the driver assistance feature based only on environmental status information as shown in FIG. 7. However, the examiner respectfully disagrees; this argument is not persuasive. Sun discloses activation of a driver assistance feature (one of a plurality of components, such as PCS, ACC, or LKA) from a deactivated state (when the activation switch 22 is turned off, i.e., S502: No of FIG. 5) based on driver data (from DSM 24 in step S504 of FIG. 5, determining an abnormal situation in step S505, such as drowsiness in step S601 of FIG. 6) and external environment data (from environmental sensor 14 used in step S701 of FIG. 7 during performance in step S509 of FIG. 5, where the target behavior is defined based on the environment surrounding the vehicle). The activation occurs without manual intervention of the driver (automatic activation in step S506 when the switch is off), and the component is activated prior to its operation (activation in S506 prior to performance in S509 using environmental data). The integration of both data types is met by Sun’s process, where driver data triggers activation and environmental data defines the target behavior and enables operation. (See Sun, col. 4-5, lines 63-67 & 1-19: “FIG. 5 shows a flowchart for a routine 500 that activates the driver assistance features.
The routine 500 begins immediately after the vehicle system is started up (by ignition on). Firstly, the routine 500 sets a flag to “0” to initialize the status at step 501 and determines whether the activation switch 22 is turned on by the driver in step 502. If the activation switch 22 is turned on (S502: Yes), the routine 500 sets a flag to “1” at step 503. If the activation switch 22 is not turned on (S502: No), then a driver status determination is executed in step 504. At step 505, the routine 500 determines whether the abnormal situation is determined. If the abnormal situation is determined (S505: Yes), the routine 500 activates the driver assistance features automatically at step 506 and subsequently sets the flag to “2” at step 507. Conversely, if the abnormal situation does not exist or is no longer detected (S505: No), the routine verifies the driver assistance features is deactivated or suppressed automatically at step 508. In other words, if the driver assistance feature was previously activated due to the abnormal situation, the driver assistance feature is deactivated at step 508 because the driving safety feature is no longer needed. After the flagging at steps 503 and 507, the driver assistance features are performed at step 509. The routine 500 returns to step 501 and repeats until the vehicle system is shut down.”; see also Sun, col. 6, lines 42-49: “FIG. 7 shows a flowchart for a routine 700 that performs the driver assistance features which is executed at step 509 in FIG. 5. At step 701, the routine acquires the environmental data from the environmental sensors 36, 38, and 40. Subsequently, the routine 700 performs the driver assistance features such as PCS, ACC, and/or LKA selectively or simultaneously at steps 702, 703, and 704 and the routine 700 goes to the end.”). The applicant also argues that Sun discloses an advanced driver assistance system (ADAS) 10, as shown in FIG.
1, including an environmental sensor 14, an activation switch 22, and a driver status monitor (DSM) 24. The DSM 24 includes a driver-facing camera 48 and a health scanner 50, as shown in FIG. 4. The environmental sensor 14 includes a camera 36, a radar 38, and/or a sonar 40 to detect the environment surrounding the vehicle. Applicant implies that this structure does not teach activation based on both driver data and external environment data. However, the examiner respectfully disagrees; this argument is not persuasive. Applicant’s summary of Sun does not point to any deficiency in the reference; rather, it supports the anticipation rejection. Sun’s DSM 24 (first sensor) detects driver data (e.g., via camera 48 and scanner 50 for status like drowsiness), and environmental sensor 14 (second sensor) detects external environment data (via camera 36, radar 38, sonar 40), which are used in the activation and operation process as explained above. (See Sun, col. 4, lines 27-45: “The driver status module 34 receives driver status data from the DSM 24 to determine the abnormal situation. The DSM 24 is installed in the vehicle cabin to monitor the driver. As shown in FIG. 4, the DSM 24 employs one or more activity sensors such as a driver-facing camera 48, a health scanner 50, and an instrument panel 52 to monitor activities performed by the driver. Based on the activity sensors 48, 50, and 52, the driver status module 34 determines whether the driver is, for example, distracted, sick, or drowsy as the abnormal situation. The driver-facing camera 48 may be mounted at the meter console to capture the driver’s face, especially the driver’s eyes. The driver status module 34 processes data received from the driver-facing camera 48 and monitors whether the driver looks away from the road based on the driver’s gaze direction. If the driver looks away, the driver status module 34 determines the abnormal situation.
The driver status module 34 may also determine whether the driver is drowsy or alert based on how much the driver’s eye opens and for how long.”). The applicant also argues that “FIG. 7 of Sun is a flowchart illustrating performance of the activated driver assistance feature that is executed in step S509 of FIG. 5. As shown in step S701 of FIG. 7, data obtained from the environmental sensor 14 is used to perform the driver assistance feature. In steps S702, S703 and S704, the driver assistance features, such as pre-collision safety brake (PCS), adaptive cruise control (ACC) or lane keeping assistance (LKA), are performed selectively or simultaneously. See col. 6, lines 42-49 of Sun. However, the environmental sensor data acquired in step S701 of FIG. 7 is executed during step S509 of FIG. 5, as described in col. 6, lines 42-49 of Sun. Thus, the environmental sensor data is obtained in step S509 of FIG. 5 of Sun after activating the driver assistance feature in step S506 of FIG. 5. The driver status data is used to determine a status of the driver in the steps illustrated in FIG. 6, which is executed during step S504 of FIG. 5. Thus, the driver status data is used to determine whether a driver assistance feature should be activated, and then the environmental sensor data is used to operate the activated driver assistance feature, as shown in steps S506 and S509 of FIG. 5 of Sun.” Applicant argues that this shows environmental data is only used after activation, not for the activation itself. However, the examiner respectfully disagrees; this argument is not persuasive. Applicant’s interpretation is overly narrow. The claim recites activation “based on the driver data and on the external environment data,” which Sun meets because the activation process (steps S502-S506) occurs in a context where environmental data is available from sensor 14, and activation leads to operation using that data in S509, without manual intervention.
Environmental data informs the target behavior (defined based on the environment surrounding the vehicle), supporting that activation is based on both. (See Sun, col. 1, lines 5-28: “The present disclosure relates to an advanced driver assistance system (ADAS) for a vehicle. An ADAS includes driver assistance features such as a pre-collision safety (PCS), an adaptive cruise control (ACC), and/or a lane keeping assist (LKA). These driver assistance features are built into the subject vehicle to help the driver avoid a collision, follow a preceding vehicle, and/or keep the vehicle in its lane. The ADAS are equipped with one or more sensors, such as an imaging camera, a millimeter-wave/laser radar, and/or a sonar sensor. The sensors detect an environment surrounding the subject vehicle including an approaching object, a crossing pedestrian, a preceding vehicle, and/or lane markers. Based on the environment detected by the sensors, the ADAS controls the subject vehicle automatically by actuating throttle, brake and/or steering to avoid accidents.”; see also Sun, col. 6, lines 50-60: “FIG. 8 shows a flowchart for a routine 800 that performs the PCS which is executed at step 702 in FIG. 7. The routine 800 determines whether the approaching object is detected based on the environmental sensors 14. If the approaching object is detected (S801: Yes), the routine 800 calculates a time-to-collision (TTC) at step 802. The routine 800 determines whether the flag is set to “1” or “2” at step 803. If the flag is set to “1”, a time threshold (Time_Th) is set to a first threshold at step 804. If the flag is set as “2”, the time threshold (Time_Th) is set to a second threshold longer than the first threshold at step 805.”).

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C.
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-4, 6-16 and 18-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Sun et al. (US 9308914 B1), hereinafter referred to as Sun.

Regarding claims 1 and 14, Sun discloses An automatic driver assist system for a vehicle comprising: a first sensor configured to detect driver data relating to a current state of a driver of the vehicle (“The driver-facing camera 48 may be mounted at the meter console to capture the driver's face, especially the driver's eyes.” [Col.4 ln 37-45]); a second sensor configured to capture external environment data relating to an external environment of the vehicle (“The radar 38 may be millimeter-wave/laser radar mounted in a front bumper.
The sonar 40 may be one or more ultrasonic sensors embedded in a rear bumper.” [Col.4 ln 20-27]); and a controller configured to activate one of a plurality of vehicle drive assist system components from a deactivated state based on the driver data and on the external environment data without manual intervention of the driver, the activated vehicle drive assist system component being configured to assist the driver in operation of the vehicle (“a method for controlling an advanced driver assistance system of a vehicle for performing one or more driver assistance features includes steps of detecting, monitoring, receiving, performing, and activating. The determining step determines an environment surrounding the vehicle. The monitoring step monitors the driver and determines an abnormal situation of the driver. The receiving step receives a signal from an activation switch operable by the driver for activating or deactivating the driver assistance features. The performing step performs the one or more driver assistance features for controlling the vehicle based on a target behavior, where the target behavior is defined based on the environment surrounding the vehicle. The activating step activates one or more driver assistance features when the activation switch is set to deactivate and the abnormal situation is determined.” [Col.2 ln 1-16]) the activated vehicle drive assist system component being activated prior to operation of the activated vehicle drive assist system component; (See Sun, col. 4-5, lines 63-67 & 1-19: “If the activation switch 22 is not turned on (S502: No), then a driver status determination is executed in step 504. At step 505, the routine 500 determines whether the abnormal situation is determined. If the abnormal situation is determined (S505: Yes), the routine 500 activates the driver assistance features automatically at step 506 and subsequently sets the flag to “2” at step 507.
… After the flagging at steps 503 and 507, the driver assistance features are performed at step 509. The routine 500 returns to step 501 and repeats until the vehicle system is shut down.”; see also Sun, col. 6, lines 42-49: “FIG. 7 shows a flowchart for a routine 700 that performs the driver assistance features which is executed at step 509 in FIG. 5. At step 701, the routine acquires the environmental data from the environmental sensors 36, 38, and 40. Subsequently, the routine 700 performs the driver assistance features such as PCS, ACC, and/or LKA selectively or simultaneously at steps 702, 703, and 704 and the routine 700 goes to the end.”). The driver assistance controller activates one or more driver assistance features (e.g., PCS, ACC, or LKA, which are vehicle drive assist system components) from a deactivated state (when the activation switch 22 is turned off, i.e., S502: No) based on driver data (abnormal situation determined in S505 from driver status data in FIG. 6) and external environment data (the features are defined and performed based on environmental data acquired in S701 of FIG. 7, and the overall system integrates environmental detection for target behavior), without manual intervention of the driver (automatic activation in S506 when switch is off and abnormal situation detected), with the activation occurring prior to operation (activation in S506 prior to performance in S509, where operation uses environmental data to control the vehicle).

Regarding claims 2, 15 and 19, Sun discloses The automatic driver assist system according to claim 1, wherein the controller is further configured to deactivate the one of the plurality of vehicle drive assist system components based on the driver data (“The determining step determines an environment surrounding the vehicle. The monitoring step monitors the driver and determines an abnormal situation of the driver.
The receiving step receives a signal from an activation switch operable by the driver for activating or deactivating the driver assistance features. The performing step performs the one or more driver assistance features for controlling the vehicle based on a target behavior, where the target behavior is defined based on the environment surrounding the vehicle. The activating step activates one or more driver assistance features when the activation switch is set to deactivate and the abnormal situation is determined.” [Col.2 ln 1-16]).

Regarding claims 3 and 20, Sun discloses The automatic driver assist system according to claim 1, wherein the driver data includes an attention level of the driver (“The driver-facing camera 48 may be mounted at the meter console to capture the driver's face, especially the driver's eyes. The driver status module 34 processes data received from the driver-facing camera 48 and monitors whether the driver looks away from the road based on the driver's gaze direction. If the driver looks away, the driver status module 34 determines the abnormal situation. The driver status module 34 may also determine whether the driver is drowsy or alert based on how much the driver's eye opens and for how long.” [Col.4 ln 37-45]).

Regarding claims 4 and 16, Sun discloses The automatic driver assist system wherein the controller is further configured to bring the vehicle to a stop when the first sensor detects the attention level of the driver is less than a predetermined threshold for a predetermined amount of time (“The PCS prevents the vehicle from hitting an object detected within the surrounding of the vehicle. When the environmental sensor 38, 38, 40 detect the object approaching the vehicle or a pedestrian crossing in front of the vehicle, the vehicle target operation module 30 calculates a time-to-collision (TTC) of the object.
If the TTC falls below a predefined threshold, the vehicle target operation module 30 actuates the brake 44 to decelerate or stop the vehicle. This kind of driver assistance feature may also be known as a forward collision warning (FCW) which provides the alert to the driver without actuating the brake 42 automatically.” [Col.5 ln 13-23]).

Regarding claims 6 and 18, Sun discloses The automatic driver assist system according to claim 1, wherein the controller is further configured to provide a visual indication to the driver when the one of the plurality of vehicle drive assist system components is activated (“The HMIs 20 notifies the driver of a possible safety concern or notifies the driver of whether and how the driver assistance feature is working. The HMIs 20 include a display for providing a visible alert, a speaker for providing an audible alert, and a vibration motor for providing a haptic alert to the driver.” [Col.3-4 ln 62-67 & 1-2]).

Regarding claim 7, Sun discloses The automatic driver assist system according to claim 1, wherein the controller is further configured to provide an audible indication when the one of the plurality of vehicle drive assist system components is activated (“The HMIs 20 notifies the driver of a possible safety concern or notifies the driver of whether and how the driver assistance feature is working. The HMIs 20 include a display for providing a visible alert, a speaker for providing an audible alert, and a vibration motor for providing a haptic alert to the driver.” [Col.3-4 ln 62-67 & 1-2]).

Regarding claim 8, Sun discloses The automatic driver assist system wherein the one of the plurality of vehicle drive assist system components is configured to assist lateral control of the vehicle (“The vehicle target operation module 30 receives the environmental data and the operation data to calculate a target behavior suitable for the environment.
Depending on the target behavior, the vehicle target operation module 30 controls the vehicle actuators 18 and performs the driver assistance features by generating accelerating force, decelerating force, and/or steering force to steer the vehicle wheels. The driving assistance features assist the driver automatically or semi-automatically with driving the vehicle safely so as to approach the target behavior. As shown in FIG. 3, the vehicle actuators 18 include a throttle 42 of an engine, a brake 44 of wheels, and a steering 46 of front wheels.” [Col.3 ln 47-57]).

Regarding claim 9, Sun discloses The automatic driver assist system wherein the one of the plurality of vehicle drive assist system components is configured to assist longitudinal control of the vehicle (“The vehicle target operation module 30 receives the environmental data and the operation data to calculate a target behavior suitable for the environment. Depending on the target behavior, the vehicle target operation module 30 controls the vehicle actuators 18 and performs the driver assistance features by generating accelerating force, decelerating force, and/or steering force to steer the vehicle wheels. The driving assistance features assist the driver automatically or semi-automatically with driving the vehicle safely so as to approach the target behavior. As shown in FIG. 3, the vehicle actuators 18 include a throttle 42 of an engine, a brake 44 of wheels, and a steering 46 of front wheels.” [Col.3 ln 47-57]).

Regarding claim 10, Sun discloses The automatic driver assist system according to claim 3, wherein the attention level of the driver includes at least one of a visual attention level, a control level and a drowsiness level of the driver (“The driver-facing camera 48 may be mounted at the meter console to capture the driver's face, especially the driver's eyes.
The driver status module 34 processes data received from the driver-facing camera 48 and monitors whether the driver looks away from the road based on the driver's gaze direction. If the driver looks away, the driver status module 34 determines the abnormal situation. The driver status module 34 may also determine whether the driver is drowsy or alert based on how much the driver's eye opens and for how long.” [Col.4 ln 37-45]). Regarding claim 11, Sun discloses The automatic driver assist system according to claim 1, wherein the automatic driver assist system is configured to be set on or off through a vehicle interface (“The driver may activate or deactivate each of the driver assistance features individually via the activation switch 22 which may be disposed at the steering wheel. When the activation switch 22 is turned on, the activation module 32 determines that the driver has activated the driving assistant features. The activation module 32 informs the vehicle target operation module 30 to activate the driving assistance features. Conversely, when the activation switch 22 is turned off, the activation module 32 determines that the driver has not activated the driving assistant features.” [Col.4 ln 3-15] and “the vehicle target operation module 30 may further control the HMIs 20 at the same time or prior to actuating the vehicle actuators 18.” [Col.3 ln 62-67]). Regarding claim 12, Sun discloses The automatic driver assist system according to claim 1, wherein the controller is further configured to provide an indication to the driver when the activation of the one of the plurality of vehicle drive assist system components is being activated (“The activation module 32 has a flag indicating how the driver assistance feature is activated. The flag “0” indicates the status is initialized. The flag “1” indicates the driver assistance feature is activated manually by the driver via the activation switch 22. 
The flag “2” indicates the driver assistance feature is activated automatically by the system based on the driver status.” [Col.4 ln 20-26]).

Regarding claim 13, Sun discloses The automatic driver assist system according to claim 1, wherein each of the plurality of vehicle drive assist system components is configured to be set on or off through a vehicle interface to allow or prevent activation by the controller (“The driver may activate or deactivate each of the driver assistance features individually via the activation switch 22 which may be disposed at the steering wheel. When the activation switch 22 is turned on, the activation module 32 determines that the driver has activated the driving assistant features. The activation module 32 informs the vehicle target operation module 30 to activate the driving assistance features. Conversely, when the activation switch 22 is turned off, the activation module 32 determines that the driver has not activated the driving assistant features.” [Col.4 ln 3-15] and “the vehicle target operation module 30 may further control the HMIs 20 at the same time or prior to actuating the vehicle actuators 18.” [Col.3 ln 62-67]).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 5 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Sun in view of Singh (US 20180231976 A1), hereinafter referred to as Singh.

Regarding claims 5 and 17, Sun discloses The automatic driver assist system according to claim 1, but does not explicitly teach wherein the controller is further configured to prevent another one of the plurality of vehicle drive assist system components from being activated based on the external environment data. However, Singh does teach wherein the controller is further configured to prevent another one of the plurality of vehicle drive assist system components from being activated based on the external environment data (“When driving assisted, in most cases at which the driver is highly attentive, the system's warnings and interventions may be felt more or less anxious or overprotective and may often bother the driver, which may lead to the driver disengaging the according DAS. For example, the driver may tend to shorten the distance to the vehicle at the same lane ahead, when preparing for an overtaking maneuver on a highway. An Adaptive Cruise Control (ACC) system may intervene by braking when underrunning the set desired distance to the vehicle ahead. This may not benefit the overtaking maneuver that the driver had in mind.” [0055]). Both Sun and Singh teach methods for driver assist control.
However, only Singh explicitly teaches wherein the controller is further configured to prevent another one of the plurality of vehicle drive assist system components from being activated based on the external environment data. It would have been obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to modify the driver assist control method of Sun to also include wherein the controller is further configured to prevent another one of the plurality of vehicle drive assist system components from being activated based on the external environment data, as taught by Singh, with a reasonable expectation of success. Doing so improves safety for operating a vehicle by providing valuable information that can be used to improve the driver assist features (with regard to this reasoning, see at least [Singh, 0004]).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to AHMED ALKIRSH whose telephone number is (703) 756-4503. The examiner can normally be reached M-F 9:00 am-5:00 pm EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, FADEY JABR, can be reached at (571) 272-1516. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/AA/
Examiner, Art Unit 3668

/Fadey S. Jabr/
Supervisory Patent Examiner, Art Unit 3668

Prosecution Timeline

Feb 28, 2022
Application Filed
Feb 21, 2024
Non-Final Rejection — §102, §103
May 23, 2024
Response Filed
Jul 29, 2024
Final Rejection — §102, §103
Nov 13, 2024
Response after Non-Final Action
Dec 13, 2024
Request for Continued Examination
Dec 17, 2024
Response after Non-Final Action
Feb 20, 2025
Non-Final Rejection — §102, §103
May 16, 2025
Response Filed
Aug 22, 2025
Final Rejection — §102, §103
Oct 21, 2025
Interview Requested
Nov 06, 2025
Applicant Interview (Telephonic)
Nov 09, 2025
Examiner Interview Summary
Nov 24, 2025
Request for Continued Examination
Dec 05, 2025
Response after Non-Final Action
Jan 26, 2026
Non-Final Rejection — §102, §103
Mar 04, 2026
Interview Requested
Apr 02, 2026
Interview Requested
Apr 13, 2026
Applicant Interview (Telephonic)
Apr 14, 2026
Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12578724
Detection of Anomalous Trailer Behavior
2y 5m to grant Granted Mar 17, 2026
Patent 12410589
METHODS AND SYSTEMS FOR IMPLEMENTING A LOCK-OUT COMMAND ON LEVER MACHINES
2y 5m to grant Granted Sep 09, 2025
Patent 12403908
NON-SELFISH TRAFFIC LIGHTS PASSING ADVISORY SYSTEMS
2y 5m to grant Granted Sep 02, 2025
Patent 12370903
METHOD FOR TORQUE CONTROL OF ELECTRIC VEHICLE ON SLIPPERY ROAD SURFACE, AND TERMINAL DEVICE
2y 5m to grant Granted Jul 29, 2025
Patent 12325450
SYSTEMS AND METHODS FOR GENERATING MULTILEVEL OCCUPANCY AND OCCLUSION GRIDS FOR CONTROLLING NAVIGATION OF VEHICLES
2y 5m to grant Granted Jun 10, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 54% (99% with interview, +53.7%)
Median Time to Grant: 3y 0m
PTA Risk: High
Based on 43 resolved cases by this examiner. Grant probability derived from career allow rate.
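One plausible reading of the interview figures (an assumption; the dashboard's exact model is not stated) is that the +53.7% lift is the gap between the examiner's allow rate with and without an interview, in which case the without-interview rate can be backed out from the 99% with-interview figure:

```python
# Assumed decomposition (not stated by the dashboard):
# lift = rate_with_interview - rate_without_interview.
rate_with = 0.99   # grant probability with interview (from the report)
lift = 0.537       # reported interview lift

rate_without = rate_with - lift
print(f"Implied rate without interview: {rate_without:.1%}")  # 45.3%
```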
