Prosecution Insights
Last updated: April 17, 2026
Application No. 17/805,435

In-Vehicle Occupant Monitoring and Calming System

Final Rejection §103
Filed: Jun 03, 2022
Examiner: ARTIMEZ, DANA FERREN
Art Unit: 3667
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Aptiv Technologies AG
OA Round: 4 (Final)
Grant Probability: 58% (Moderate)
Expected OA Rounds: 5-6
Time to Grant: 3y 2m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 58% (grants 58% of resolved cases; 46 granted / 80 resolved; +5.5% vs TC avg)
Interview Lift: +43.9% (strong), measured across resolved cases with an interview
Typical Timeline: 3y 2m avg prosecution; 42 applications currently pending
Career History: 122 total applications across all art units

Statute-Specific Performance

§101: 19.0% (-21.0% vs TC avg)
§103: 46.2% (+6.2% vs TC avg)
§102: 7.3% (-32.7% vs TC avg)
§112: 24.6% (-15.4% vs TC avg)
Tech Center average estimate shown for comparison • Based on career data from 80 resolved cases
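A quick consistency check on the per-statute figures above, as a sketch (rates and deltas copied from this section): each statute's rate minus its "vs TC avg" delta should recover the same Tech Center baseline.

```python
# Per-statute overcome rates and "vs TC avg" deltas from the section above.
stats = {
    "§101": (19.0, -21.0),
    "§103": (46.2, +6.2),
    "§102": (7.3, -32.7),
    "§112": (24.6, -15.4),
}

# rate = tc_avg + delta, so the implied baseline is rate - delta.
implied = {s: round(rate - delta, 1) for s, (rate, delta) in stats.items()}
print(implied)  # every statute implies the same ~40.0% TC baseline
```

All four deltas are internally consistent with a single ~40% Tech Center average, which suggests the tool computes each delta against one shared baseline rather than per-statute baselines.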

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. The Examiner notes that the rejections are based on the broadest reasonable interpretation of the claim language. Applicant is kindly invited to consider each reference as a whole. References are to be interpreted as by one of ordinary skill in the art rather than as by a novice. See MPEP 2141. Therefore, the relevant inquiry when interpreting a reference is not what the reference expressly discloses on its face but what the reference would teach or suggest to one of ordinary skill in the art.

Status of the Claims

This is a Final Office Action in response to Applicant’s amendment of 29 April 2025. Claims 1-10, 14 and 17-20 are pending and have been considered as follows.

Response to Amendment and/or Arguments

Applicant’s amendments and arguments with respect to the rejection of Claims 1, 14, and 20 under 35 U.S.C. 103 as set forth in the Office Action of 29 January 2025 have been considered but are NOT persuasive. Specifically, Applicant argues (Pages 10-13 of Applicant’s Remarks dated 29 April 2025) that “The proposed combination of Yi and Upmanue’528 is improper…”, that the references fail to disclose or suggest the newly amended features, and that the rejections of the claims under 35 U.S.C. 103 should therefore be reconsidered and withdrawn. However, not all limitations require a suggestion in the prior art references being applied. Limitations may also be found in the understanding of one of ordinary skill in the art. See MPEP 2141, III (“Prior art is not limited just to the references being applied, but includes the understanding of one of ordinary skill in the art. 
The prior art references (or references when combined) need not teach or suggest all the claim limitations.”) Furthermore, upon careful review of the prior art of record and the newly amended limitations filed on 29 April 2025, independent claims 1, 14 and 20 are rejected over Upmanue’528 in view of Upmanue’803 and Ekchian. Accordingly, Applicant’s arguments are NOT persuasive. See the 35 U.S.C. 103 rejections below for details.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office Action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 1-4, 7-8, 14, 17 and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Upmanue et al. (US 2020/0310528 A1, hereinafter Upmanue’528) in view of Upmanue et al. (US 2020/0282803 A1, hereinafter Upmanue’803) and Ekchian et al. (US 2021/0221258 A1, hereinafter Ekchian).

Regarding Claim 1 (similarly claims 14 and 20), Upmanue’528 teaches a method (see at least Fig. 
2 Abstract) comprising: obtaining occupant data from one or more occupancy-monitoring sensors of a vehicle (see at least Fig. 1-4 [0003-0044]: the OSM may employ one or more activity sensors, such as an occupant-facing camera, a health scanner, and an instrument panel, to monitor activities performed by occupants); determining, based on the occupant data, whether an occupant of the vehicle is distressed, the occupant being a minor seated in a rear seat of the vehicle (see at least Fig. 1-4 [0003-0044]: the OSM may also employ a mic that is in communication with a voice recognition (VR) engine that can detect fussiness of a child (e.g. a baby) or irregular crying or sounds from the child. Based on the activity sensors, the OSM may determine whether the occupant is, for example, fussy, experiencing motion sickness, hungry, or feverish. For example, in a rear-seat occupant stress-out scenario, the HMI display 115 may display a message to the driver that the rear-seat occupant is stressed and needs comfort); in response to determining that the occupant is distressed, displaying, on a display located in a field of view of a driver of the vehicle, an image or video of the occupant (see at least Fig. 1-4 [0003-0044]: for example, in a rear-seat occupant stress-out scenario, the HMI display 115 may display a message to the driver that the rear-seat occupant is stressed and needs comfort. The HMI display 115 may utilize any type of monitor or display utilized to display relevant information to the occupants. The HMI display 115 may also include a heads-up display (“HUD”) that is utilized to display an interface and other objects on a windshield so that the images are within a driver’s periphery while driving or in an occupant’s line of sight); displaying, on the display along with the image or video, a plurality of selectable options for the driver to select to calm or soothe the occupant (see at least Fig. 
1-4 [0003-0044]: the system may constantly detect an occupant’s mood or emotion to determine if emotional intervention is required. The system may have defined moods that are utilized based on the data collected from various sensors in the vehicle. The system may determine potential solutions based on historical data or generic data. The solutions may be output onto a display and provide instructions to the driver. The solution may be output on a display of the vehicle (e.g. an instrument panel display, heads-up display (HUD), or navigation display). The solution may also include suggestions to the driver to activate certain music or certain vehicle features (e.g. massage features, heating/cooling settings, ambient lighting, etc.)), the plurality of selectable options including: playing an audio file in the vehicle (see at least Fig. 1-4 [0003-0044]: the solution may also include suggestions to the driver to activate certain music or certain vehicle features (e.g. massage features, heating/cooling settings, ambient lighting, etc.)); adjusting ambient lighting of the vehicle by dimming or brightening interior lights of the vehicle (see at least Fig. 1-4 [0003-0044]: the solution may also include suggestions to the driver to activate certain music or certain vehicle features (e.g. massage features, heating/cooling settings, ambient lighting, etc.)); activating actuators or motors in or attached to the rear seat to introduce a vibration or motion pattern (see at least Fig. 1-4 [0003-0044]: the solution may also include suggestions to the driver to activate certain music or certain vehicle features (e.g. massage features, heating/cooling settings, ambient lighting, etc.)); and executing at least one of the selectable options when selected by the driver (see at least Fig. 1-4 [0003-0044]: in addition to providing visual indications, the HMI display 115 may also be configured to receive user input via a touch-screen, user interface buttons, etc. 
The HMI display 115 may be configured to receive user commands indicative of various vehicle controls such as audio-visual controls, autonomous vehicle system controls, certain vehicle features, cabin temperature control, etc. The controller 101 may receive such user input and in turn command a relevant vehicle system or component to perform in accordance with the user input.)

It may be argued that Upmanue’528 does not explicitly teach playing a multimedia file on a rear entertainment system of the vehicle; displaying an image or video of the driver to the occupant by way of the rear entertainment system; raising or lowering sunshades of the vehicle; and activating additional motors in or attached to the rear seat to move the rear seat forward and backward to introduce a rocking motion. Upmanue’803 is directed to a baby comfort monitoring system and method for a vehicle. Upmanue’803 teaches playing a multimedia file on a rear entertainment system of the vehicle (see at least Fig. 3-4 [0022-0036]: the system may activate the audio system to play soothing music to assist the baby in falling asleep, or activate the audio system to play lively/engaging music or play a movie or video); displaying an image or video of the driver to the occupant via the rear entertainment system (see at least Fig. 3-4 [0022-0036]: the system may display a user to engage and talk with the baby); and raising or lowering sunshades of the vehicle (see at least Fig. 
3-4 [0022-0036]: the system may adjust the window control system to tint/lighten the vehicle windows, or lower/raise window shades to darken/brighten the car). Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified Upmanue’528’s vehicle system to incorporate the techniques of playing a multimedia file on a rear entertainment system of the vehicle, displaying an image or video of the driver to the occupant via the rear entertainment system, and/or raising or lowering sunshades of the vehicle, as taught by Upmanue’803, with a reasonable expectation of success, to yield a system and method that meets pediatric recommendations and government regulations for actively monitoring a baby’s physical state within the vehicle to ensure safety and make the ride more enjoyable/comfortable for both the driver and the vehicle occupants (Upmanue’803 [0002]).

It may be argued that the combination of Upmanue’528 in view of Upmanue’803 does not explicitly teach activating additional motors in or attached to the rear seat to move the rear seat forward and backward to introduce a rocking motion. Ekchian is directed to a system and method for controlling an active vehicle seat based on operator input from a user interface. Ekchian teaches activating additional motors in or attached to the rear seat to move the rear seat forward and backward to introduce a rocking motion (see at least Abstract [0076-0081]: the user interface may provide a child sleep mode option where the seat rocks back and forth in a transverse or longitudinal direction to aid a child’s sleep). The Examiner notes that Upmanue’528 and Upmanue’803 disclose the technique of presenting a suggested/recommended solution for soothing/calming an occupant’s emotion upon determining that an occupant is distressed, and that Ekchian’s vehicle is equipped with various internal and external sensors capable of detecting and monitoring an occupant’s status. 
Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Upmanue’528 and Upmanue’803 to provide an active vehicle seat and the technique of offering the driver a child sleep mode option on a user interface that actuates the seat to rock back and forth to aid a child’s sleep, as taught by Ekchian, with a reasonable expectation of success, to enhance the soothing effect provided to the infant. Doing so would have been expected to improve the overall effectiveness of the soothing system by adding a known technique (i.e. a rocking motion) that provides calming and comforting effects, and such a modification could reasonably be expected to benefit driver safety by reducing infant crying or restlessness during vehicle operation, which are known to cause driver distraction.

Regarding claim 2, the combination of Upmanue’528 in view of Upmanue’803 and Ekchian teaches the method of claim 1, wherein Upmanue’528 further teaches the one or more occupancy-monitoring sensors comprise one or more microphones (see at least Fig. 1-4 [0003, 0029]: the system may have defined moods that are utilized based on the data collected from various sensors in the vehicle, such as a camera and a voice recognition system (e.g. microphone)); and the occupant data comprise audio data from the rear seat of the vehicle (see at least Fig. 1-4 [0003, 0029]: the system may have defined moods that are utilized based on the data collected from various sensors in the vehicle, such as a camera and a voice recognition system (e.g. microphone)).

Regarding claim 3, the combination of Upmanue’528 in view of Upmanue’803 and Ekchian teaches the method of claim 2. Upmanue’528 does not explicitly teach wherein determining whether the occupant is distressed comprises determining, based on the audio data, whether sounds or changes in breathing from the occupant are associated with distress. 
Upmanue’803 is directed to a baby comfort monitoring system and method for a vehicle. Upmanue’803 teaches wherein determining whether the occupant is distressed comprises determining, based on the audio data, whether sounds or changes in breathing from the occupant are associated with distress (see at least [0014-0029]: the system may be programmed to use sound identification to differentiate between whether the sound corresponds to a baby crying, a baby babbling, coughing, or other background sounds. Based on the level or strength of the baby’s crying or the severity of the baby’s cough, the system may determine whether the baby is sick or distressed). Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Upmanue’528 and Ekchian to incorporate the technique of determining whether the occupant is distressed based on audio data, wherein the sounds or changes in breathing indicate distress, as taught by Upmanue’803, with a reasonable expectation of success, to yield a system and method that meets pediatric recommendations and government regulations for actively monitoring a baby’s physical state within the vehicle to ensure safety and make the ride more enjoyable/comfortable for both the driver and the vehicle occupants (Upmanue’803 [0002]).

Regarding claim 4, the combination of Upmanue’528 in view of Upmanue’803 and Ekchian teaches the method of claim 1, and Upmanue’528 further teaches wherein: the one or more occupancy-monitoring sensors comprise one or more cameras (see at least Fig. 1-4 [0012-0044]: the in-vehicle camera 103 may be mounted in the vehicle to monitor occupants (e.g. a driver or passenger) within the vehicle cabin. The in-vehicle camera 103 may be part of an occupant status monitoring system (OSM). The in-vehicle camera 103 may be utilized to capture images of an occupant in the vehicle. 
The in-vehicle camera 103 may obtain facial information about an occupant, such as eye movement and head movement of the occupant, as discussed further below. The in-vehicle camera may be a color camera, an infrared camera, or a time-of-flight camera. The in-vehicle camera 103 may be mounted on a head rest, in the headliner, or located on a mobile device (e.g. a tablet or mobile phone) to capture the driver’s face, especially the driver’s eyes); and the occupant data comprise camera data of the rear seat of the vehicle (see at least Fig. 1-4 [0012-0044]: for example, in a rear-seat occupant stress-out scenario, the HMI display 115 may display a message to the driver that the rear-seat occupant is stressed and needs comfort).

Regarding claim 7 (similarly claim 19), the combination of Upmanue’528 in view of Upmanue’803 and Ekchian teaches the method of claim 1, and Upmanue’528 further teaches the one or more occupancy-monitoring sensors comprise at least one of an infrared camera, a radar sensor, a time-of-flight camera, a thermographic camera, or an ultrasonic sensor (see at least Fig. 1-4 [0016]: the in-vehicle camera may be a color camera, infrared camera, or time-of-flight camera); the occupant data comprise biometric data associated with the occupant, the biometric data including at least one of a body temperature, a breathing pattern, or a heart rate of the occupant (see at least Fig. 1-4 [0019]: a health scanner may be mounted on the vehicle seat, child seat, or another suitable location which the occupant touches. The health scanner may scan the occupant’s heartbeat to determine a heart rate, and the OSM processes data received from the health scanner and monitors whether the occupant is suffering from a severe physical condition or episode. The OSM may also be utilized with the health scanner to see if various fluctuations in data may identify stress or issues with the occupant. 
The health scanner may include sensors for blood pressure monitoring, electrocardiogram (EKG/ECG) monitoring, or other body vitals); and determining whether the occupant is distressed comprises determining, based on the biometric data, whether changes in the biometric data associated with the occupant are associated with distress (see at least Fig. 1-4 [0019]: a health scanner may be mounted on the vehicle seat, child seat, or another suitable location which the occupant touches. The health scanner may scan the occupant’s heartbeat to determine a heart rate, and the OSM processes data received from the health scanner and monitors whether the occupant is suffering from a severe physical condition or episode. The OSM may also be utilized with the health scanner to see if various fluctuations in data may identify stress or issues with the occupant. The health scanner may include sensors for blood pressure monitoring, electrocardiogram (EKG/ECG) monitoring, or other body vitals).

Regarding claim 8 (similarly claim 17), the combination of Upmanue’528 in view of Upmanue’803 and Ekchian teaches the method of claim 1. Upmanue’528 does not explicitly teach providing the occupant data as an input to a machine-learned model, the occupant data comprising at least one of camera data, audio data, or biometric data associated with the occupant; and determining, by the machine-learned model, whether changes in the occupant data are associated with distress. 
Upmanue’803 is directed to a baby comfort monitoring system and method for a vehicle. Upmanue’803 teaches providing the occupant data as an input to a machine-learned model, the occupant data comprising at least one of camera data, audio data, or biometric data associated with the occupant (see at least [0023-0024]: it is contemplated that the system may operate to learn a baby’s physical state and may include a deep neural network or convolutional neural network that goes through a learning stage to be able to recognize certain physical states, to detect when the baby’s temperature, breathing, and heart rate indicate that the baby is sick, e.g. with a fever); and determining, by the machine-learned model, whether changes in the occupant data are associated with distress (see at least [0023-0024]: it is contemplated that the system may operate to learn a baby’s physical state and may include a deep neural network or convolutional neural network that goes through a learning stage to be able to recognize certain physical states, to detect when the baby’s temperature, breathing, and heart rate indicate that the baby is sick, e.g. with a fever). Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Upmanue’528 and Ekchian to incorporate the technique of providing the occupant data as an input to a machine-learned model, the occupant data comprising at least one of camera data, audio data, or biometric data associated with the occupant, and determining, by the machine-learned model, whether changes in the occupant data are associated with distress, as taught by Upmanue’803, with a reasonable expectation of success, to yield a system and method that meets pediatric recommendations and government regulations for actively monitoring a baby’s physical state within the vehicle to ensure safety and make the ride more enjoyable/comfortable for both the driver and the vehicle occupants 
(Upmanue’803 [0002]).

Claim(s) 5-6 are rejected under 35 U.S.C. 103 as being unpatentable over Upmanue’528 in view of Upmanue’803, Ekchian, and Zhang et al. (US 10,666,901 B1, hereinafter Zhang).

Regarding claim 5, the combination of Upmanue’528 in view of Upmanue’803 and Ekchian teaches the method of claim 1. It may be argued that the combination of Upmanue’528 in view of Upmanue’803 and Ekchian does not explicitly teach tracking, using the camera data, positions of one or more key body points of the occupant; and determining whether the occupant is distressed comprises determining whether changes in the positions of the one or more key body points are above a movement threshold. Zhang is directed to a system and method for soothing an occupant in a vehicle. Zhang teaches tracking, using the camera data, positions of one or more key body points of the occupant (see at least Col. 6 Lines 39-52: the system may utilize a camera to track movement and facial expressions to collect information about the occupant); and determining whether the occupant is distressed comprises determining whether changes in the positions of the one or more key body points are above a movement threshold (see at least Fig. 4, Col. 8 Line 59 - Col. 11 Line 25: the system may determine the stress load of the occupant utilizing the data collected by the various sensors in the vehicle. The system may adjust the threshold based on various factors in the vehicle, such as an occupant of the car. The system may also allow for automatic adjustment of the threshold, which may be set by the user or adjusted via a vehicle interface. As such, the stress load data may be collected and analyzed to measure and compare against the threshold to determine whether an intervention should be applied to the occupant based on the occupant’s stress load.) 
Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Upmanue’528, Upmanue’803, and Ekchian to incorporate the technique of tracking, using the camera data, positions of one or more key body points of the occupant, and determining whether the occupant is distressed by determining whether changes in the positions of the one or more key body points are above a movement threshold, as taught by Zhang, with a reasonable expectation of success, to ensure the occupant’s safety and improve ride comfort for vehicle occupants.

Regarding claim 6, the combination of Upmanue’528 in view of Upmanue’803, Ekchian, and Zhang teaches the method of claim 5. The combination of Upmanue’528 in view of Upmanue’803 and Ekchian does not explicitly teach wherein the key body points include at least one of a head, shoulders, eyes, arms, legs, or hips of the occupant. Zhang is directed to a system and method for soothing an occupant in a vehicle. Zhang teaches wherein the key body points include at least one of a head, shoulders, eyes, arms, legs, or hips of the occupant (see at least Col. 6 Lines 39-52: the system may utilize a camera to track movement and facial expressions to collect information about the occupant). Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Upmanue’528, Upmanue’803, and Ekchian to incorporate the technique of tracking, using the camera data, positions of one or more key body points of the occupant to determine whether the occupant is sick/distressed, as taught by Zhang, with a reasonable expectation of success, to ensure the occupant’s safety and improve ride comfort for vehicle occupants.

Claim(s) 9 and 18 are rejected under 35 U.S.C. 
103 as being unpatentable over Upmanue’528 in view of Upmanue’803, Ekchian, and Zhang.

Regarding claim 9 (similarly claim 18), the combination of Upmanue’528, Upmanue’803, and Ekchian teaches the method of claim 8. Upmanue’528 does not explicitly teach determining, based on the changes in the occupant data and using the machine-learned model, a probability that the occupant is distressed; and determining whether the probability that the occupant is distressed is greater than a distress threshold, the distress threshold being a configurable confidence value. Upmanue’803 is directed to a baby comfort monitoring system and method for a vehicle. Upmanue’803 teaches determining, based on the changes in the occupant data and using the machine-learned model, a probability that the occupant is distressed (see at least Fig. 3 [0022-0035]: the controller may use data received from the camera and sensors to determine whether the baby’s physical state is within a predefined comfort range/settings/threshold, and may use the sensors to determine whether the baby’s vitals indicate signs of being sick or distressed (e.g. based on a temperature deviation, or a heart rate outside of the predefined comfort settings). 
That is, a probability that the baby is sick/distressed can be determined based on changes in occupant data along with a machine-learned model). Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Upmanue’528 and Ekchian to incorporate the technique of determining, based on the changes in the occupant data and using the machine-learned model, a probability that the occupant is distressed, as taught by Upmanue’803, with a reasonable expectation of success, to yield a system and method that meets pediatric recommendations and government regulations for actively monitoring a baby’s physical state within the vehicle to ensure safety and make the ride more enjoyable/comfortable for both the driver and the vehicle occupants (Upmanue’803 [0002]). The combination of Upmanue’528 in view of Upmanue’803 and Ekchian does not explicitly teach determining whether the probability that the occupant is distressed is greater than a distress threshold, the distress threshold being a configurable confidence value. Zhang is directed to a system and method for soothing an occupant in a vehicle. Zhang teaches determining whether the probability that the occupant is distressed is greater than a distress threshold, the distress threshold being a configurable confidence value (see at least Fig. 2-4, Col. 6 Line 65 - Col. 11 Line 25: determining a stress level of the occupant utilizing data collected by the various sensors in the vehicle, and adjusting the threshold based on various factors in the vehicle. The system may also allow for automatic adjustment of the threshold, which may be set by the user or adjusted via a vehicle interface. As such, the stress load data may be collected and analyzed to measure and compare against the threshold to determine whether an intervention should be applied to the occupant based on the occupant’s stress load. 
) Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Upmanue’528, Upmanue’803, and Ekchian to incorporate the technique of determining whether the probability that the occupant is distressed is greater than a distress threshold, the distress threshold being a configurable confidence value, as taught by Zhang, with a reasonable expectation of success, to ensure the occupant’s safety and improve ride comfort for vehicle occupants.

Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Upmanue’528 in view of Upmanue’803, Ekchian, and Bostrom et al. (US 2017/0347067 A1, hereinafter Bostrom).

Regarding claim 10, the combination of Upmanue’528, Upmanue’803, and Ekchian teaches the method of claim 1. The combination of Upmanue’528, Upmanue’803, and Ekchian does not explicitly teach: before displaying the image or video of the occupant, applying a processing action to the image or video of the occupant, the processing action comprising at least one of: cropping the image or video to focus on the occupant; adjusting a brightness or contrast of the image or video; or adjusting ambient lighting of the vehicle. Bostrom is directed to a vehicle display system with selective image data display. Bostrom teaches, before displaying the image or video of the occupant, applying a processing action to the image or video of the occupant, the processing action comprising cropping the image or video to focus on the occupant (see at least Fig. 1-3 [0017-0030]: the controller may be operable to crop the image data, focusing on the region of interest, e.g. 
identifying a facial region of the passenger or occupant). Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Upmanue’528, Upmanue’803, and Ekchian to incorporate the technique of applying a processing action to crop the image or video to focus on the occupant, as taught by Bostrom, with a reasonable expectation of success, such that the region of interest can be clearly viewed by the driver.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DANA F ARTIMEZ, whose telephone number is (571) 272-3410. The examiner can normally be reached M-F, 9:00 am-3:30 pm EST. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Faris S. Almatrahi, can be reached at (313) 446-4821. 
The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DANA F ARTIMEZ/
Examiner, Art Unit 3667

/FARIS S ALMATRAHI/
Supervisory Patent Examiner, Art Unit 3667

Prosecution Timeline

Jun 03, 2022: Application Filed
May 02, 2024: Non-Final Rejection — §103
Jul 24, 2024: Applicant Interview (Telephonic)
Jul 24, 2024: Examiner Interview Summary
Aug 09, 2024: Response Filed
Aug 16, 2024: Final Rejection — §103
Oct 29, 2024: Examiner Interview Summary
Oct 29, 2024: Applicant Interview (Telephonic)
Nov 29, 2024: Request for Continued Examination
Dec 03, 2024: Response after Non-Final Action
Jan 15, 2025: Non-Final Rejection — §103
Apr 29, 2025: Response Filed
Jul 05, 2025: Final Rejection — §103
Mar 30, 2026: Response after Non-Final Action

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596371
SYSTEM AND METHOD FOR INTERCEPTION AND COUNTERING UNMANNED AERIAL VEHICLES (UAVS)
2y 5m to grant • Granted Apr 07, 2026
Patent 12573078
METHOD AND APPARATUS FOR DETERMINING VEHICLE LOCATION BASED ON OPTICAL CAMERA COMMUNICATION
2y 5m to grant • Granted Mar 10, 2026
Patent 12571646
Automated Discovery and Monitoring of Uncrewed Aerial Vehicle Ground-Support Infrastructure
2y 5m to grant • Granted Mar 10, 2026
Patent 12560441
METHOD AND APPARATUS FOR OPTIMIZING A MULTI-STOP TOUR WITH FLEXIBLE MEETING LOCATIONS
2y 5m to grant • Granted Feb 24, 2026
Patent 12560936
SYSTEMS AND METHODS FOR OBJECT DETECTION
2y 5m to grant • Granted Feb 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

AI Strategy Recommendation

Get an AI-powered prosecution strategy using examiner precedents, rejection analysis, and claim mapping.

Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 58%
With Interview: 99% (+43.9%)
Median Time to Grant: 3y 2m
PTA Risk: High
Based on 80 resolved cases by this examiner. Grant probability derived from career allow rate.
