Prosecution Insights
Last updated: April 19, 2026
Application No. 18/411,163

AUTONOMOUS MANEUVERING OF VEHICLE INTO SERVICE STATION

Final Rejection — §101, §103, §112
Filed: Jan 12, 2024
Examiner: KIM, ANDREW SANG
Art Unit: 3668
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: VALEO SCHALTER UND SENSOREN GMBH
OA Round: 2 (Final)

Grant Probability: 83% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 6m
With Interview: 87%

Examiner Intelligence

Career Allow Rate: 83% (146 granted / 175 resolved) — +31.4% vs TC avg, above average
Interview Lift: +3.8% among resolved cases with interview — minimal (~+4%) lift
Typical Timeline: 2y 6m avg prosecution
Career History: 197 total applications across all art units; 22 currently pending
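The headline figures above are simple ratios over the examiner's resolved cases. A minimal sketch of the arithmetic, with the dashboard's numbers hard-coded (variable names are illustrative, not from any real API):

```python
# Figures quoted in the Examiner Intelligence panel.
granted = 146
resolved = 175

# Career allow rate: share of resolved cases that ended in a grant.
allow_rate = granted / resolved                # ~0.834, displayed as 83%

# "With interview" probability: base rate plus the observed +3.8% lift
# among resolved cases that included an examiner interview.
interview_lift = 0.038
with_interview = allow_rate + interview_lift   # ~0.872, displayed as 87%

print(f"allow rate:     {allow_rate:.1%}")
print(f"with interview: {with_interview:.1%}")
```

This reproduces both displayed percentages (83% and 87%) from the raw counts, confirming the two cards are consistent with each other.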

Statute-Specific Performance

§101: 12.3% (-27.7% vs TC avg)
§103: 44.9% (+4.9% vs TC avg)
§102: 14.7% (-25.3% vs TC avg)
§112: 22.2% (-17.8% vs TC avg)
Tech Center average is an estimate • Based on career data from 175 resolved cases
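Each delta in this table is simply the examiner's rate minus a Tech Center baseline. A small sketch (dashboard figures hard-coded, purely illustrative) recovers that baseline, and it turns out to be a flat 40.0% for all four statutes, consistent with the baseline being an estimate rather than a per-statute measurement:

```python
# (examiner rate %, delta vs Tech Center average %) as displayed above
statute_stats = {
    "101": (12.3, -27.7),
    "103": (44.9, +4.9),
    "102": (14.7, -25.3),
    "112": (22.2, -17.8),
}

for statute, (rate, delta) in statute_stats.items():
    # The baseline the delta was measured against: rate - delta.
    implied_tc_avg = rate - delta
    print(f"§{statute}: implied TC average = {implied_tc_avg:.1f}%")
```

All four lines print the same 40.0% implied baseline.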

Office Action

§101 §103 §112
DETAILED ACTION

This Office Action is in response to Applicant’s Amendment and Remarks filed on 10/02/2025. Claims 1, 3-8, 10-15 and 17-20 received on 10/26/2021 are considered in this Office Action. Claims 2, 9 and 16 are cancelled. Claims 1, 3-8, 10-15 and 17-20 are pending for examination.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Claims 2-3, 9-10 and 16-17 were previously rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention. In response to the Applicant’s amendment, the rejection has been withdrawn.

Claims 15-20 were previously rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The Examiner draws the Applicant’s attention to dependent claims 17-20, which are directed to the computer-readable storage medium, which lacks sufficient antecedent basis and is thus rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter. In response to the amendment to claim 15 and the interpretation made by the Examiner under the 35 U.S.C. 112(b) rejection for dependent claims 17-20, the rejection is withdrawn.

Applicant’s arguments with respect to independent claims 1, 8 and 15 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 17-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention. Claims 17-20 recite the limitation "The computer-readable storage medium". There is insufficient antecedent basis for this limitation in the claim. For examination purposes, the Examiner will interpret “The computer-readable storage medium of claim” as “The non-transitory computer-readable storage medium”.

Applicant’s arguments with respect to claims 1, 8 and 15 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Cunningham (US 20240411312 A1), in view of An (US 20230311893 A1), and further in view of SUZUKI (US 20210237722 A1).

Regarding claim 8, Cunningham teaches a system for autonomously maneuvering a vehicle into a service station (FIG. 1; FIG. 2), the system comprising: a plurality of cameras mounted to the vehicle and configured to capture images of an environment outside the vehicle (FIG. 1; para. [0041]-[0042]: “In various embodiments, the sensor array 32 includes sensors for measuring observable conditions of the vehicle 10, including of the environment surrounding the vehicle 10, and generating sensor data based thereon.
The sensing devices in the sensor array 32 may include, but are not limited to, radars, lidars, global positioning systems, optical cameras, […] The sensor array 32 includes one or more environment sensors 36. In various embodiments, the environment sensors 36 are disposed to monitor the surroundings of the vehicle 10 such that various temperatures, positions, characteristics, distances, images, and other observable parameters are measured”);

a processor communicatively coupled to the plurality of cameras (FIG. 1; para. [0043]: “the controller 30 is coupled with the sensor array 32 and provides instructions for controlling the vehicle 10”);

and memory having instructions that, when executed by the processor, cause the processor to (para. [0049]: “program product with one or more types of non-transitory computer-readable signal bearing media used to store the program and the instructions thereof and carry out the distribution thereof, such as a non-transitory computer readable medium bearing the program and containing computer instructions stored therein for causing a computer processor (such as the processor 42) to perform and execute the program”):

execute a (para. [0050]: “Thereafter, the processor 42 may establish correlations between the stored values 56 and the sensed environment to assign attributes for object detection, object classification, and the like. The resulting objects and their classification may be used in operating the vehicle 10, which, in turn, influences commands generated or otherwise provided by the processor 42 to control actuators of the vehicle 10, such as for steering, braking, acceleration, and for other functions, via the actuator system 22. As the data is captured or generated it is logged and may be stored in the storage device 50, or in other devices of the vehicle 10”) and determine a presence of at least one rail located in the environment (FIG. 2; FIG. 3 206; para. [0058]: “Various systems that capture and identify aspects of the carwash system 100 may be employed to capture 206 scenes, whether as a series of images or otherwise, for analysis on the processor 42 chip to identify the objects of interest such as the entry opening 104, the rails 106, the conveyor 108, the brushes 110, the gantry 112, the signal 114, and/or other attributes/features”);

receive map data via a Global Positioning System (GPS) (para. [0002]: “These vehicles may use information from global positioning systems (GPS) technology, navigation systems, vehicle-to-vehicle communication, vehicle-to-infrastructure technology, and/or drive-by-wire systems to determine actions, such as to navigate”; para. [0056]: “the location of the carwash system 100 may be entered into a navigation system of the vehicle 10 and upon arrival, the carwash system 100 may be identified by the controller 30 and the start 202 fulfilled. In some embodiments, when the vehicle 10 is autonomously operated the vehicle 10 is aware of its location and the start 202 is fulfilled […] the start 202 may be accomplished by other methods, such as by using GPS location data”, wherein the locations of the carwash and vehicle are used to indicate receiving map data via a Global Positioning System (GPS));

determine that the vehicle is located at the service station based upon at least the determined presence of the at least one rail (FIG. 2; para. [0056]: “the method 200 identifies the presence of the carwash system 100. The vehicle 10 may use any part of the sensor array 32 whether in camera, radar, lidar, ultrasonic or other form for the identification.”);

determine that the vehicle is located at the service station based upon the map data (para. [0056]: “the method 200 starts 202 when the vehicle 10 approaches a carwash entrance, such as of the carwash system 100. In some embodiments, the location of the carwash system 100 may be entered into a navigation system of the vehicle 10 and upon arrival, the carwash system 100 may be identified by the controller 30 and the start 202 fulfilled. In some embodiments, when the vehicle 10 is autonomously operated the vehicle 10 is aware of its location and the start 202 is fulfilled […] the start 202 may be accomplished by other methods, such as by using GPS location data”, wherein the vehicle determines that it is at the service station based on its GPS location data and map data);

generate, on a vehicle user interface, an interactive display providing an operator of the vehicle with an option to command the vehicle to activate a low-speed maneuvering system, wherein the interactive display is generated in response to the determination that the vehicle is located near the service station (FIG. 3; para. [0059]: “Following the capture 206 of the features of the carwash system 100, the controller 30, […] method 200 proceeds to prompt 212, such as through the interface 48, the driver to confirm engagement of the carwash detection and entry assistance system 40 by the controller 30”; para. [0046]: “the interface 48 includes a display screen”, wherein the entry assistance system corresponds to activation of a low-speed maneuvering system),

and output control signals to autonomously maneuver the vehicle into the service station in response to the low-speed maneuvering system being activated, wherein the control signals cause the vehicle to maneuver relative to a location of the at least one rail as determined by (FIG. 3; para. [0062]: “the method 200 determines 218 the path into the carwash system 100 including approach and alignment. For example, the controller 30 computes the steering angle(s) to result in placing the wheels of the vehicle 10 on the conveyor 108. In other embodiments, the controller 30 computes the steering angle to center the vehicle 10 withing the gantry 112. In other embodiments, other attributes/features of the carwash system 100 may be used to compute a path for the vehicle 10 into the carwash system 100.”; para. [0068]: “The vehicle 10 autonomously operates and enters 230 the carwash system 100. The controller 30 operates the systems of the vehicle 10, including steering, accelerator, and braking, to move the vehicle 10 into the carwash system 100 along the determined 218 path using the actuator system 22. As part of the entry 230 operation, the controller 30, via the sensor array 32 may monitor the distance to the attributes/features of the carwash system 100 and may monitor the signal(s) 114 to stop the vehicle 10 at the proper location for the carwash system 100.”; para. [0021]: “The controller via an actuator system, operates components of the vehicle. The controller determines the path based on the location of the rail. The controller via the actuator system, moves the vehicle into the wash facility along the path”),

but fails to specifically teach confirming the determination that the vehicle is located at the service station based upon the map data, or a computer vision (CV) machine-learning model to process the images.

However, in the same field of endeavor, An teaches determining that the vehicle is located at the service station based upon at least the determined presence of the at least one feature (para. [0071]: “the controller 102 may determine whether the vehicle enters the car wash through analysis of an image captured by the camera 140. The car wash is equipped with a unique facility for automatic car washing of a vehicle, a registered name, a notice (character string), etc. The unique facility may be, for example, in the shape of a tunnel (or gantry) through which a vehicle is washed while passing therethrough.”); confirming the determination that the vehicle is located at the service station based upon the map data (para. [0054]: “the controller 102 may determine whether the vehicle enters the car wash by using current location guidance information of the navigation device 130. That is, when the information on the current location of the vehicle provided by the navigation device 130 points to a place registered as the car wash, the controller 102 may determine that the vehicle is currently located in the car wash”; para. [0057]: “The controller 102 may determine whether the vehicle enters the car wash using all of the navigation device 130, the camera 140, and the RF communication device 150, using only any two of them, or using only any one of them.”, wherein the embodiment that uses any two of them — here, the camera and the navigation device — to determine whether the vehicle is at the car wash indicates confirming the determination that the vehicle is located at the service station based upon the map data, as the current location will also be checked when the camera determines that the vehicle is located at the service station, thus comprising the confirmation step); and generating, on a vehicle user interface (FIG. 6).

Cunningham and An are analogous to the claimed invention because they pertain to providing assistance to the driver when detecting a carwash facility. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the car wash detection of Cunningham and incorporate the teaching of An to use both the camera and navigation information to determine whether the vehicle is at the car wash. Doing so will optimize detection that the vehicle is at a car wash by requiring various detection methods.

Cunningham in view of An fails to specifically teach a computer vision (CV) machine-learning model to process the images.

However, in the same field of endeavor, SUZUKI teaches a computer vision (CV) machine-learning model to process the images (para. [0064]: “The environment information is used for autonomous driving control. In the present embodiment, sensors for perception 260 include a camera that captures an image around vehicle 1 (including its front and rear sides) […] ADC computer 210 can recognize, for example, a person, an object (e.g., another vehicle, a pole, a guard rail and the like), and a line (e.g., a center line) on a road that are present in a range perceivable from vehicle 1 by using environment information received from sensors for perception 260. Artificial intelligence (AI) or an image processing processor may be used for recognition”, wherein a CV machine-learning model is a form of AI for image processing) and the at least one rail as determined by the CV machine-learning model (para. [0064]: “ADC computer 210 can recognize, for example, a person, an object (e.g., another vehicle, a pole, a guard rail and the like), and a line (e.g., a center line) on a road that are present in a range perceivable from vehicle 1 by using environment information received from sensors for perception 260. Artificial intelligence (AI) or an image processing processor may be used for recognition”).

SUZUKI is considered analogous to the claimed invention because it is reasonably pertinent to the problem of processing images to identify features from image/sensor data.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to substitute the image processing algorithm of Cunningham in view of An with the image processing based on artificial intelligence (AI) of SUZUKI, because they both perform the function of processing environmental data such as images, and one could have substituted the mechanisms and the result of the substitution would have been predictable in classifying objects detected by the sensor, thus recognizing features related to a carwash facility, such as a rail.

Regarding claim 10, Cunningham in view of An and further in view of SUZUKI teaches the system of claim 8. Cunningham further teaches wherein the map data includes a label of a location of the service station as being associated with the service station (FIG. 2; para. [0056]: “In the current embodiment, the method 200 starts 202 when the vehicle 10 approaches a carwash entrance, such as of the carwash system 100. In some embodiments, the location of the carwash system 100 may be entered into a navigation system of the vehicle 10 and upon arrival, the carwash system 100 may be identified by the controller 30 and the start 202 fulfilled. In some embodiments, when the vehicle 10 is autonomously operated the vehicle 10 is aware of its location and the start 202 is fulfilled. […] In other embodiments the start 202 may be accomplished by other methods, such as by using GPS location data.”; para. [0051]: “The control system 28 processes sensor data along with other data to determine a position (e.g., a local position relative to a map,”, wherein determining that the vehicle approaches a carwash entrance based on GPS location data, the navigation system and the map indicates that the map data includes a label of a nearby location as being a type of service station).

Regarding claim 11, Cunningham in view of An and further in view of SUZUKI teaches the system of claim 8. Cunningham further teaches wherein the service station is a car wash (FIG. 2; para. [0052]: “Referring to FIG. 2 along with FIG. 1, in accordance with various embodiments the vehicle 10 may be presented with a scenario for entering a wash facility designated as a carwash system 100”; para. [0056]: “the method 200 identifies the presence of the carwash system 100”).

Regarding claim 12, Cunningham in view of An and further in view of SUZUKI teaches the system of claim 8. Cunningham further teaches wherein the control signals include acceleration commands and steering commands to autonomously maneuver one or more wheels of the vehicle to align with the at least one rail (Cunningham para. [0054]: “Inside the walls 102, the car wash system may have one or more rails 106 for guiding the wheels of the vehicle 10.”; Cunningham para. [0021]: “In additional embodiments, the controller via a sensor system, senses a location of a rail of the wash facility. The controller via an actuator system, operates components of the vehicle. The controller determines the path based on the location of the rail. The controller via the actuator system, moves the vehicle into the wash facility along the path”; Cunningham para. [0064]: “At this step, there are options for proceeding. When the operator of the vehicle 10 has configured, such as through the interface 48, the vehicle carwash detection and entry assistance system 40 for automatic steering and acceleration of the vehicle 10”; Cunningham para. [0068]: “The vehicle 10 autonomously operates and enters 230 the carwash system 100. The controller 30 operates the systems of the vehicle 10, including steering, accelerator, and braking, to move the vehicle 10 into the carwash system 100 along the determined 218 path using the actuator system 22.”, wherein “determines the path based on the location of the rail” indicates the vehicle aligning with the at least one rail).

Regarding claim 13, Cunningham in view of An and further in view of SUZUKI teaches the system of claim 8. Cunningham further teaches wherein the memory has further instructions that, when executed by the processor, cause the processor to (para. [0049]: “non-transitory computer readable medium bearing the program and containing computer instructions stored therein for causing a computer processor (such as the processor 42) to perform”): confirm, via a prompt on the user interface, that the operator desires to activate the low-speed maneuvering system (para. [0059]: “the method 200 proceeds to prompt 212, such as through the interface 48, the driver to confirm engagement of the carwash detection and entry assistance system 40 by the controller 30. The method 200 determines 214 whether the driver has confirmed engagement. When the determination 214 is negative, meaning the driver has declines assistance or entered manual operation without entry assistance, the method 200 ends 216 and the driver controls the vehicle 10 without assistance, and no or limited assistance is provided by the controller 30 to the operator”); wherein the control signals are output only after receiving confirmation from the operator (FIG. 3; para. [0060]: “When the determination 214 is positive, meaning the operator has accepted an automatic carwash entry assistance mode of operation by the controller 30, the method 200 proceeds to prepare 215 the vehicle 10 for entry to the carwash system 100.”; para. [0062]: “the method 200 determines 218 the path into the carwash system 100 including approach and alignment.”; para. [0068]: “The vehicle 10 autonomously operates and enters 230 the carwash system 100. The controller 30 operates the systems of the vehicle 10, including steering, accelerator, and braking, to move the vehicle 10 into the carwash system 100 along the determined 218 path using the actuator system 22.”; para. [0064]: “At this step, there are options for proceeding. When the operator of the vehicle 10 has configured, such as through the interface 48, the vehicle carwash detection and entry assistance system 40 for automatic steering and acceleration of the vehicle 10”).

Regarding claim 14, Cunningham in view of An and further in view of SUZUKI teaches the system of claim 13. Cunningham further teaches wherein the prompt includes instructions for the operator to release a brake pedal of the vehicle and release a steering wheel of the vehicle (Cunningham FIG. 3; para. [0064]: “At this step, there are options for proceeding. When the operator of the vehicle 10 has configured, such as through the interface 48, the vehicle carwash detection and entry assistance system 40 for automatic steering and acceleration of the vehicle 10, the controller 30 prompts 224 the operator through the interface 48 to remove hands from the steering system (steering wheel) and refrain from apply pressure to the accelerator and brake pedals.”).

Regarding claim 1, it is a method claim reciting limitations similar to those performed by the system of claim 8, and therefore is rejected on the same basis.

Regarding claim 3, it is a method claim reciting limitations similar to those performed by the system of claim 10, and therefore is rejected on the same basis.

Regarding claim 4, it is a method claim reciting limitations similar to those performed by the system of claim 11, and therefore is rejected on the same basis.

Regarding claim 5, it is a method claim reciting limitations similar to those performed by the system of claim 12, and therefore is rejected on the same basis.

Regarding claim 6, it is a method claim reciting limitations similar to those performed by the system of claim 13, and therefore is rejected on the same basis.

Regarding claim 7, it is a method claim reciting limitations similar to those performed by the system of claim 14, and therefore is rejected on the same basis.

Regarding claim 15, it recites a non-transitory computer-readable storage medium containing instructions that, when executed by one or more processors, cause the one or more processors to (Cunningham para. [0049]: “program product with one or more types of non-transitory computer-readable signal bearing media used to store the program and the instructions thereof and carry out the distribution thereof, such as a non-transitory computer readable medium bearing the program and containing computer instructions stored therein for causing a computer processor (such as the processor 42) to perform and execute the program”) perform limitations similar to those performed by the system of claim 8, and therefore is rejected on the same basis.

Regarding claim 17, it recites a computer-readable storage medium containing instructions to further cause the one or more processors to perform limitations similar to those performed by the system of claim 10, and therefore is rejected on the same basis.

Regarding claim 18, Cunningham in view of An and further in view of SUZUKI teaches the computer-readable storage medium of claim 15. The combination of Cunningham and SUZUKI further teaches wherein the autonomous driving commands include at least one of an acceleration command and a steering command to autonomously maneuver one or more wheels of the vehicle to align with the at least one rail based upon the CV machine-learning model (Cunningham para. [0054]: “Inside the walls 102, the car wash system may have one or more rails 106 for guiding the wheels of the vehicle 10.”; Cunningham para. [0021]: “In additional embodiments, the controller via a sensor system, senses a location of a rail of the wash facility. The controller via an actuator system, operates components of the vehicle. The controller determines the path based on the location of the rail. The controller via the actuator system, moves the vehicle into the wash facility along the path”; Cunningham para. [0068]: “The vehicle 10 autonomously operates and enters 230 the carwash system 100. The controller 30 operates the systems of the vehicle 10, including steering, accelerator, and braking, to move the vehicle 10 into the carwash system 100 along the determined 218 path using the actuator system 22.”; SUZUKI para. [0064]: “ADC computer 210 can recognize, for example, a person, an object (e.g., another vehicle, a pole, a guard rail and the like), and a line (e.g., a center line) on a road that are present in a range perceivable from vehicle 1 by using environment information received from sensors for perception 260. Artificial intelligence (AI) or an image processing processor may be used for recognition”, wherein AI for image processing indicates a CV machine-learning model, which may be used to detect rails, hence the aligning is performed at least with the CV machine-learning model).

Regarding claim 19, it recites a computer-readable storage medium containing instructions to further cause the one or more processors to perform limitations similar to those performed by the system of claim 13, and therefore is rejected on the same basis.

Regarding claim 20, it recites a computer-readable storage medium containing instructions to further cause the one or more processors to perform limitations similar to those performed by the system of claim 14, and therefore is rejected on the same basis.
Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. AKANUMA (US20240416944A1) teaches using both the image data and position data to determine the presence of a car wash.

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANDREW S KIM, whose telephone number is (571) 272-7356. The examiner can normally be reached Mon - Fri, 8AM - 5PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, James J Lee, can be reached at (571) 270-5965. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/A.S.K./
Examiner, Art Unit 3668

/JAMES J LEE/
Supervisory Patent Examiner, Art Unit 3668

Prosecution Timeline

Jan 12, 2024 — Application Filed
Jun 27, 2025 — Non-Final Rejection (§101, §103, §112)
Oct 01, 2025 — Response Filed
Oct 14, 2025 — Final Rejection (§101, §103, §112) (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594949
NOTIFICATION DEVICE, NOTIFICATION METHOD, AND NONTRANSITORY RECORDING MEDIUM PROVIDED WITH COMPUTER PROGRAM FOR NOTIFICATION DEVICE
2y 5m to grant — Granted Apr 07, 2026
Patent 12594940
INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND NON-TRANSITORY COMPUTER-READABLE STORAGE MEDIUM
2y 5m to grant — Granted Apr 07, 2026
Patent 12589725
VEHICLE AND CONTROL METHOD FOR DETERMINING AN EMERGENCY SITUATION
2y 5m to grant — Granted Mar 31, 2026
Patent 12583487
APPARATUS FOR CONTROLLING AUTONOMOUS DRIVING AND METHOD THEREOF
2y 5m to grant — Granted Mar 24, 2026
Patent 12565331
FIRE DETECTION SYSTEM AND METHOD FOR MONITORING AN AIRCRAFT COMPARTMENT AND SUPPORTING A COCKPIT CREW WITH TAKING REMEDIAL ACTION IN CASE OF A FIRE ALARM
2y 5m to grant — Granted Mar 03, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 83%
With Interview: 87% (+3.8%)
Median Time to Grant: 2y 6m
PTA Risk: Moderate
Based on 175 resolved cases by this examiner. Grant probability derived from career allow rate.
