Prosecution Insights
Last updated: April 18, 2026
Application No. 18/519,203

METHOD AND APPARATUS OF PREDICTING POSSIBILITY OF ACCIDENT IN REAL TIME DURING VEHICLE DRIVING

Status: Final Rejection (§103)
Filed: Nov 27, 2023
Examiner: KONG, SZE-HON
Art Unit: 3657
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
OA Round: 2 (Final)
Grant Probability: 65% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 4m
With Interview: 80%

Examiner Intelligence

Career Allow Rate: 65% (above average; 392 granted / 603 resolved; +13.0% vs TC avg)
Interview Lift: +14.8% for resolved cases with interview (moderate, roughly +15%)
Typical Timeline: 3y 4m avg prosecution
Currently Pending: 24
Career History: 627 total applications across all art units

Statute-Specific Performance

§101: 5.8% (-34.2% vs TC avg)
§103: 55.6% (+15.6% vs TC avg)
§102: 15.4% (-24.6% vs TC avg)
§112: 21.8% (-18.2% vs TC avg)
Deltas are measured against an estimated Tech Center average • Based on career data from 603 resolved cases
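The panel's headline figures can be reproduced from its own raw counts. A minimal sketch (no new data; the dictionary simply restates the per-statute rates and deltas above, and subtracting each delta recovers the Tech Center average estimate, which comes out to the same ~40% line for every statute):

```python
# Reproduce the dashboard's derived statistics from its raw counts.
granted, resolved = 392, 603  # career totals from the Examiner Intelligence panel

allow_rate = round(granted / resolved * 100, 1)  # career allow rate, in percent

# Per-statute overcome rate and its delta vs the Tech Center average,
# exactly as listed above: rate - delta recovers the estimated TC average.
statutes = {
    "101": (5.8, -34.2),
    "103": (55.6, +15.6),
    "102": (15.4, -24.6),
    "112": (21.8, -18.2),
}
tc_avg = {s: round(rate - delta, 1) for s, (rate, delta) in statutes.items()}

print(allow_rate)     # 65.0
print(tc_avg["103"])  # 40.0 (every statute implies the same TC average line)
```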

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments filed 1/12/2026 have been fully considered but they are not persuasive. Applicant's arguments with respect to claims 1-8 and 15-19 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 11/13/2025 was filed. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1, 5-8 and 15-19 are rejected under 35 U.S.C. 103 as being unpatentable over Desies et al. (US 2024/0383497 A1) and Suo et al. (US 2022/0153314 A1).

For claims 1 and 15, Desies discloses an apparatus for predicting a possibility of an accident, the apparatus comprising: a processor; an abstraction module executed by the processor and configured to abstract surrounding situation data of an ego-vehicle input from a sensor and movement data of the ego-vehicle input from the sensor to generate abstracted driving situation data (para. 0003, 0004, 0033, 0044, 0067, where the traffic situation the vehicle is experiencing is determined based on vehicle and environment data); a calculation module executed by the processor and configured to calculate a digitized score of a possibility of an accident of the ego-vehicle, based on the abstracted driving situation data (para. 0006, 0011, 0017-0021, where a score of a possible accident of the vehicle is determined based on the situation); and an action generating module executed by the processor and configured to generate action data of the ego-vehicle for decreasing the possibility of the accident, based on the score (abstract, para. 0012, 0057, where the vehicle is controlled to decrease the possibility of an accident based on the score).

Desies does not explicitly disclose converting the surrounding situation data into figures representing objects around the ego-vehicle and converting the movement data into time-series data represented on a two- or three-dimensional coordinate system; or a prediction function trained via a long short-term memory (LSTM) neural network and stored in the calculation module as a trained model configured to perform real-time accident-risk inference based on the abstracted driving situation data, and using the trained model to calculate the score.

Suo, in the same field of art, discloses converting the surrounding situation data into figures representing objects around the ego-vehicle and converting the movement data into time-series data represented on a two- or three-dimensional coordinate system (Fig. 3, 4, 8, 9, abstract, para. 0003, 0011-0013, 0015, 0040, where the surrounding objects are represented graphically and movement data are presented as time-series data in a coordinate system); and a prediction function trained via an LSTM neural network and stored in the calculation module as a trained model configured to perform real-time accident-risk inference based on the abstracted driving situation data, and using the trained model to calculate the score (para. 0006, 0120, 0127, where the prediction and collision prevention system may utilize machine-learned models including an LSTM neural network).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Desies to convert the surrounding situation data into figures representing objects around the ego-vehicle, to convert the movement data into time-series data represented on a two- or three-dimensional coordinate system, and to include a prediction function trained via an LSTM neural network and stored in the calculation module as a trained model configured to perform real-time accident-risk inference based on the abstracted driving situation data, using the trained model to calculate the score, as taught by Suo, to utilize known simulation techniques for preventing collisions in vehicle controls.

For claims 5 and 17, Desies, as modified, discloses the apparatus of claims 1 and 15, wherein the abstracted driving situation data is data implemented in an image form or a table form (Fig. 2).

For claim 6, Desies discloses the method of claim 1, wherein the generating of the action data of the ego-vehicle comprises generating the action data of the ego-vehicle for decreasing the possibility of the accident of the ego-vehicle (abstract, para. 0012, 0057, where the vehicle is controlled to decrease the possibility of an accident based on the score).

For claim 7, Desies discloses the method of claim 6, wherein the generating of the action data of the ego-vehicle comprises generating the action data of the ego-vehicle based on guide data for avoiding an accident caused by an accident-causing action of the ego-vehicle (para. 0012, 0013, 0015, 0025, where action data of the vehicle is generated based on the situation and the data that guide the vehicle to avoid the accident).

For claim 8, Desies discloses the method of claim 7, wherein the calculating of the digitized score of the possibility of the accident of the ego-vehicle further comprises providing the guide data (Fig. 1, para. 0007, 0014, 0015, where warnings or suggested responses may be issued to avoid an accident).

For claim 16, Desies, as modified, discloses the apparatus of claim 15, wherein the abstraction module is configured to express the surrounding situation data of the ego-vehicle as the figures to generate the abstracted driving situation data (Fig. 2).

For claim 18, Desies discloses the apparatus of claim 15, wherein the calculation module is further configured to output guide data for avoiding an accident caused by an accident-causing action of the ego-vehicle (Fig. 1, para. 0007, 0014, 0015, where warnings or suggested responses may be issued to avoid an accident).

For claim 19, Desies discloses the apparatus of claim 18, wherein the action generating module is configured to generate action data of the ego-vehicle based on the guide data (para. 0012, 0013, 0015, 0025, where action data of the vehicle is generated based on the situation and the data that guide the vehicle to avoid the accident).

Claims 2-4 are rejected under 35 U.S.C. 103 as being unpatentable over Desies et al. (US 2024/0383497 A1) and Suo et al. (US 2022/0153314 A1), as applied to claim 1 above, and further in view of Mortazavi et al. (US 2020/0269877 A1).

For claim 2, Desies, as modified, discloses the method of claim 1, wherein the generating of the abstracted driving situation data comprises: generating abstracted environment data where the surrounding situation data of the ego-vehicle is expressed as the figures (Fig. 2); generating abstracted action data where the movement data of the ego-vehicle is expressed graphically (Fig. 2); and generating the driving situation data configured to include the abstracted environment data and the abstracted action data (Fig. 2), but does not explicitly disclose the movement data being expressed as the time-series data represented on the two- or three-dimensional coordinate system.
However, it is conventional in the art to express vehicle movements and trajectories in a coordinate system, presenting vehicle movements relative to objects in the environment to better gauge the traffic situation. Moreover, Suo, as discussed above for claim 1, discloses the movement data being expressed as the time-series data represented on the two- or three-dimensional coordinate system. Further, Mortazavi, in the same field of art, discloses known vehicle movements in a coordinate system (Fig. 3, 13, para. 0002, 0062, 0067, 0110, 0111, where vehicle movements and trajectories are presented in a coordinate system).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Mortazavi to express the movement data as the time-series data represented on the two- or three-dimensional coordinate system, as taught by Mortazavi, to provide a clearer and better analysis of the traffic situation.

For claim 3, Desies, as modified, discloses the method of claim 2, wherein the figure comprises a line, a triangle, a tetragon (Desies, Fig. 2), dots, and a circle (Mortazavi, Fig. 3, 13).

For claim 4, Desies, as modified, discloses the method of claim 2, wherein the coordinate system is a two-dimensional (2D) coordinate system including an x axis representing first movement data of the ego-vehicle and a y axis representing second movement data of the ego-vehicle (Mortazavi, Fig. 3, 11, 13).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Ng et al. (12,202,518) discloses an autonomous navigation system that prevents collisions using predicted movement of objects in time series in a 3D coordinate system. Hari et al. (US 2021/0389769 A1) discloses generation of driving scenarios for autonomous vehicles, presenting vehicle environments, movements, and trajectories.
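For readers less familiar with the claimed architecture, the three-module pipeline the rejection maps onto the references (abstraction, then LSTM-based scoring, then action generation) can be sketched roughly as follows. This is a hypothetical illustration only: the function names are invented, and a simple logistic of the ego-to-object distance stands in for the claimed trained LSTM prediction function.

```python
import math

def abstract_situation(objects, movements):
    """Abstraction module (sketch): surrounding objects become simple
    figures, and ego movement becomes a time series on a 2D coordinate
    system, mirroring the claimed abstracted driving situation data."""
    figures = [{"shape": "tetragon", "x": x, "y": y} for (x, y) in objects]
    trajectory = [(t, x, y) for t, (x, y) in enumerate(movements)]
    return {"figures": figures, "trajectory": trajectory}

def accident_score(abstracted):
    """Calculation module (sketch): a digitized accident-possibility score.
    A logistic of the distance to the nearest figure is a stand-in for the
    claimed trained LSTM model."""
    _, ex, ey = abstracted["trajectory"][-1]  # ego's latest position
    nearest = min(math.hypot(f["x"] - ex, f["y"] - ey)
                  for f in abstracted["figures"])
    return 1.0 / (1.0 + math.exp(nearest - 5.0))  # near 1 when close, near 0 when far

def generate_action(score, threshold=0.5):
    """Action-generating module (sketch): action data intended to decrease
    the possibility of an accident, based on the score."""
    return "brake" if score >= threshold else "maintain"

# Ego closing on an object two units ahead: high score, so brake.
near = abstract_situation(objects=[(2.0, 0.0)],
                          movements=[(0.0, 0.0), (1.0, 0.0)])
print(generate_action(accident_score(near)))  # brake

# Distant object: low score, so maintain.
far = abstract_situation(objects=[(40.0, 0.0)],
                         movements=[(0.0, 0.0), (1.0, 0.0)])
print(generate_action(accident_score(far)))   # maintain
```

The point of the sketch is only the data flow the claims recite: sensor data is first abstracted into figures plus a coordinate-system time series, a trained model then turns that abstraction into a single digitized score, and the score alone drives the generated action.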
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Sze-Hon Kong, whose telephone number is (571) 270-1503. The examiner can normally be reached 9 AM-5 PM, Mon-Fri.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Abby Lin, can be reached at (571) 270-3976. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/SZE-HON KONG/
Primary Examiner, Art Unit 3657

Prosecution Timeline

Nov 27, 2023: Application Filed
Oct 16, 2025: Non-Final Rejection (§103)
Jan 12, 2026: Response Filed
Apr 04, 2026: Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12589739: METHOD FOR MONITORING A LANE CHANGE OF A VEHICLE (granted Mar 31, 2026; 2y 5m to grant)
Patent 12583449: DRIVER ASSISTANCE APPARATUS AND METHOD FOR MAINTAINING AND COMPENSATING BRAKING TORQUE IN ADAPTIVE CRUISE CONTROL (granted Mar 24, 2026; 2y 5m to grant)
Patent 12583461: SYSTEMS AND METHODS FOR DETECTING DRIVER BEHAVIOR (granted Mar 24, 2026; 2y 5m to grant)
Patent 12576818: Method and System for Enhanced Braking in a Tractor Unit (granted Mar 17, 2026; 2y 5m to grant)
Patent 12576846: VEHICLE INCREMENTAL MOVEMENT SYSTEM (granted Mar 17, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 65%
With Interview: 80% (+14.8%)
Median Time to Grant: 3y 4m
PTA Risk: Moderate
Based on 603 resolved cases by this examiner. Grant probability derived from career allow rate.
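The "With Interview" figure is consistent with simply adding the +14.8-point lift to the baseline allow rate and rounding. A minimal sketch (the additive-lift assumption is mine; the page does not state its formula):

```python
baseline = 65.0        # career allow rate, in percent (392/603 resolved)
interview_lift = 14.8  # percentage-point lift for resolved cases with interview

# Assumed formula: lift is additive in percentage points on the baseline.
with_interview = round(baseline + interview_lift)
print(with_interview)  # 80
```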
