Prosecution Insights
Last updated: April 19, 2026
Application No. 17/563,072

METHOD AND ELECTRONIC APPARATUS FOR PREDICTING PATH BASED ON OBJECT INTERACTION RELATIONSHIP

Status: Final Rejection (§103)
Filed: Dec 28, 2021
Examiner: BEAN, JARED C
Art Unit: 3669
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Industrial Technology Research Institute
OA Round: 6 (Final)
Grant Probability: 63% (Moderate)
Expected OA Rounds: 7-8
Expected Time to Grant: 3y
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 63% (74 granted / 118 resolved; +10.7% vs TC avg)
Interview Lift: +38.7% (allow rate in resolved cases with an interview vs. without)
Avg Prosecution: 3y; 33 applications currently pending
Total Applications: 151, across all art units
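The headline allow rate is just the grant count over resolved cases, rounded for display. A quick sanity check of that arithmetic, using the figures reported for this examiner (variable names are illustrative):

```python
# Figures from this report: 74 granted out of 118 resolved cases.
granted = 74
resolved = 118

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # 62.7%, shown as 63% after rounding
```

The 63% shown in the dashboard is this ratio rounded to the nearest whole percent.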

Statute-Specific Performance

§101: 15.9% (-24.1% vs TC avg)
§103: 61.4% (+21.4% vs TC avg)
§102: 11.6% (-28.4% vs TC avg)
§112: 7.7% (-32.3% vs TC avg)

"vs TC avg" compares against the Tech Center average estimate. Based on career data from 118 resolved cases.

Office Action

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

This final rejection is in response to Applicant’s amended filing of 10/22/2025. Claims 1, 3, 5, 7-11, 13, 15, and 17-20 are currently pending and have been examined. Applicant has amended claims 1 and 11.

Response to Arguments

Applicant’s arguments with respect to claims 1, 3, 5, 7-11, 13, 15, and 17-20 rejected under 35 USC § 103 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 7, 9-11, 17, and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Yan et al.
(US 20190367020 A1) in view of Schmueli Friedland et al. (US 20220189307 A1) and Petroff et al. (US 20210061269 A1).

Regarding claim 1, Yan discloses a method for predicting a path based on an object interaction relationship, adapted for an electronic apparatus comprising a processor, wherein the electronic apparatus is configured to control a first vehicle (see at least abstract and ¶ [0018-0019] disclosing an in-vehicle control system, proximate vehicle intention prediction module, and image processing module), and the method comprises: receiving a video comprising a plurality of image frames (see at least ¶ [0018] and [0043] disclosing an image processing module receiving image frames for predicting behavior); performing object recognition on a certain image frame among the plurality of image frames to recognize at least one object in the certain image frame (see at least ¶ [0057] where the proximate vehicle intention prediction module performs image analysis to identify objects); obtaining preset interactive relationship information associated with the at least one object from an interactive relationship database based on the at least one object (see at least ¶ [0046-0047], [0059-0060], and [0063] describing training data being collected by a training data collection system database to help inform the vehicle of potential maneuvers in anticipation of a detected object’s trajectory); determining a first trajectory for navigating the first vehicle based on the preset interactive relationship information (see at least ¶ [0018-0020], [0030-0031], [0059-0060], and [0063] where the in-vehicle control system operates steering and braking components based on data received by the trained proximate vehicle intention prediction system of detected objects and their corresponding trajectories and/or maneuvers); and controlling, by the processor, the first vehicle to move according to the first trajectory (see at least ¶ [0018-0020], [0024], [0030-0031], [0059-0060], and [0063] where the in-vehicle control system operates steering and braking components), wherein determining the first trajectory for navigating the first vehicle based on the preset interactive relationship information comprises: determining the first trajectory of the first vehicle based on the predicted trajectory (see at least ¶ [0038] where a computing system controls the steering of the vehicle in response to determining the trajectory of an obstacle in order to avoid it).

While Yan discloses generating a predicted trajectory of a predicted object based on the preset interactive relationship information (see at least ¶ [0052], [0059-0060], and [0063] describing training data being collected by a training data collection system database to help inform the vehicle of potential maneuvers in anticipation of a detected object’s trajectory), it does not explicitly disclose wherein generating the predicted trajectory of the predicted object based on the preset interactive relationship information comprises: in response to determining that the preset interactive relationship information comprises a first type of object interactive relationship, obtaining a preset object that does not appear in a current image frame as the predicted object from the interactive relationship database based on at least one recognized object, wherein the preset object is obtained from the preset interactive relationship information of the at least one recognized object, and the preset object is preset to interact with the at least one recognized object; and calculating the predicted trajectory of the predicted object based on the first type of object interactive relationship and a trajectory of the at least one recognized object, wherein the first type of object interactive relationship indicates an interactive relationship between the at least one recognized object and the preset object that does not appear in the current image frame.
However, Schmueli Friedland suggests, in response to determining that the preset interactive relationship information comprises a first type of object interactive relationship, obtaining a preset object that does not appear in a current image frame as the predicted object from the interactive relationship database based on at least one recognized object, wherein the preset object is obtained from the preset interactive relationship information of the at least one recognized object, and the preset object is preset to interact with the at least one recognized object (see at least abstract, ¶ [0136-0138], and Figs. 11-12 disclosing observing objects in an image frame, including certain objects and speculative objects associated with the certain objects, and predicting their respective trajectories); and calculating the predicted trajectory of the predicted object based on the first type of object interactive relationship and a trajectory of the at least one recognized object, wherein the first type of object interactive relationship indicates an interactive relationship between the at least one recognized object and the preset object that does not appear in the current image frame (see at least abstract, ¶ [0136-0138], and Figs. 11-12).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to incorporate the certain and speculative objects of Schmueli Friedland with the trajectory prediction system of Yan, with a reasonable expectation of success, because both inventions are directed toward predicting the path of an object and/or obstacle for a vehicle and preventing collisions involving them. This would help the vehicle more accurately determine the trajectory of both seen and unseen obstacles by retrieving information on the type of object approaching the vehicle, thereby better helping the vehicle avoid colliding with the object.

The combination of Yan and Schmueli Friedland does not disclose wherein determining the first trajectory of the first vehicle based on the predicted trajectory comprises: calculating a predicted collision time based on the predicted trajectory of the predicted object, which is a virtual moving object not detected by a sensor of the first vehicle, and based on an original target trajectory of the first vehicle, and adjusting the original target trajectory of the first vehicle based on the predicted collision time to generate the first trajectory of the first vehicle.

However, Petroff suggests calculating a predicted collision time based on the predicted trajectory of the predicted object, which is a virtual moving object not detected by a sensor of the first vehicle, and based on an original target trajectory of the first vehicle, and adjusting the original target trajectory of the first vehicle based on the predicted collision time to generate the first trajectory of the first vehicle (see at least abstract, ¶ [0039-0045], and Figs. 5A-7 disclosing an autonomous vehicle processor detecting an occlusion close to an intersection and generating a phantom obstacle whose speed it predicts to determine a conflict zone in the intersection and the timing to the conflict zone where the autonomous vehicle may collide with the phantom obstacle). While Petroff does not explicitly suggest its phantom obstacle is associated with a detected obstacle, it does preempt potential collisions from unseen obstacles that are occluded by the environment and would likely be present given proximity to an intersection.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to incorporate the collision timing of Petroff with the combination of Yan and Schmueli Friedland, with a reasonable expectation of success, because both inventions are directed toward predicting the path of an object and/or obstacle for a vehicle and preventing collisions involving them. This would allow the system to gauge how much of an immediate threat the speculative object is by timing its approach and avoiding a collision before the predicted collision time.

Regarding claims 7 and 17, the combination of Yan and Schmueli Friedland does not disclose adjusting a driving velocity of the first vehicle in the original target trajectory based on the predicted collision time to generate the first trajectory. However, Petroff teaches adjusting a driving velocity of the first vehicle in the original target trajectory based on the predicted collision time to generate the first trajectory (see at least abstract, ¶ [0039-0045], and Figs. 5A-7 disclosing an autonomous vehicle processor detecting an occlusion close to an intersection and generating a phantom obstacle whose speed it predicts to determine a conflict zone in the intersection and the timing to the conflict zone where the autonomous vehicle may collide with the phantom obstacle, and consequently accelerating or decelerating the vehicle in response). While Petroff does not explicitly suggest its phantom obstacle is associated with a detected obstacle, it does preempt potential collisions from unseen obstacles that are occluded by the environment and would likely be present given proximity to an intersection. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to incorporate the collision timing of Petroff with the combination of Yan and Schmueli Friedland, with a reasonable expectation of success, because both inventions are directed toward predicting the path of an object and/or obstacle for a vehicle and preventing collisions involving them. This would allow the system to gauge how much of an immediate threat the speculative object is by timing its approach and avoiding a collision before the predicted collision time.

Regarding claims 9 and 19, Yan discloses performing an image recognition operation to recognize an object feature value of the predicted object (see at least ¶ [0029] and [0042] where the proximate vehicle intention prediction system extracts feature information of an object and/or proximate vehicle, including its speed, pose, heading, orientation, acceleration, etc.); and calculating the predicted trajectory of the predicted object based on the object feature value (see at least ¶ [0042] where the proximate vehicle intention prediction system predicts a target vehicle’s trajectory through features including its speed, pose, heading, orientation, acceleration, etc.).

Regarding claims 10 and 20, Yan discloses obtaining lane geometry information from an environment information database based on positioning data of the first vehicle (see at least ¶ [0057] disclosing that image analysis includes roadway lanes in the environment of the host vehicle and how they are used to predict trajectories of proximate vehicles); and calculating the predicted trajectory of the predicted object based on the lane geometry information (see at least ¶ [0057]).
Regarding claim 11, Yan discloses an electronic apparatus, adapted for controlling a first vehicle (see at least abstract and ¶ [0018-0019] disclosing an in-vehicle control system, proximate vehicle intention prediction module, and image processing module), wherein the electronic apparatus comprises: a storage device, storing an interactive relationship database (see at least abstract and ¶ [0018-0019], [0035], and [0059-0060] disclosing a storage device used to store maneuver training data); and a processor, coupled to the storage device (see at least abstract and ¶ [0018-0019] and [0035] disclosing an in-vehicle control system, proximate vehicle intention prediction module, and image processing module connected to a storage device), wherein the processor is configured to: receive a video comprising a plurality of image frames (see at least ¶ [0018] and [0043] disclosing an image processing module receiving image frames for predicting behavior); perform object recognition on a certain image frame among the plurality of image frames to recognize at least one object in the certain image frame (see at least ¶ [0057] where the proximate vehicle intention prediction module performs image analysis to identify objects); obtain preset interactive relationship information associated with the at least one object from the interactive relationship database based on the at least one object (see at least ¶ [0046-0047], [0059-0060], and [0063] describing training data being collected by a training data collection system database to help inform the vehicle of potential maneuvers in anticipation of a detected object’s trajectory); determine a first trajectory for navigating the first vehicle based on the preset interactive relationship information (see at least ¶ [0018-0020], [0030-0031], [0059-0060], and [0063] where the in-vehicle control system operates steering and braking components based on data received by the trained proximate vehicle intention prediction system of detected objects and their corresponding trajectories and/or maneuvers); and control the first vehicle to move according to the first trajectory (see at least ¶ [0018-0020], [0024], [0030-0031], [0059-0060], and [0063] where the in-vehicle control system operates steering and braking components), wherein determining the first trajectory for navigating the first vehicle based on the preset interactive relationship information comprises: determining the first trajectory of the first vehicle based on the predicted trajectory (see at least ¶ [0038] where a computing system controls the steering of the vehicle in response to determining the trajectory of an obstacle in order to avoid it).

While Yan discloses generating a predicted trajectory of a predicted object based on the preset interactive relationship information (see at least ¶ [0052], [0059-0060], and [0063] describing training data being collected by a training data collection system database to help inform the vehicle of potential maneuvers in anticipation of a detected object’s trajectory), it does not explicitly disclose wherein generating the predicted trajectory of the predicted object based on the preset interactive relationship information comprises: in response to determining that the preset interactive relationship information comprises a first type of object interactive relationship, obtaining a preset object that does not appear in a current image frame as the predicted object from the interactive relationship database based on at least one recognized object, wherein the preset object is obtained from the preset interactive relationship information of the at least one recognized object, and the preset object is preset to interact with the at least one recognized object; and calculating the predicted trajectory of the predicted object based on the first type of object interactive relationship and a trajectory of the at least one recognized object, wherein the first type of object interactive relationship indicates an interactive relationship between the at least one recognized object and the preset object that does not appear in the current image frame.

However, Schmueli Friedland suggests, in response to determining that the preset interactive relationship information comprises a first type of object interactive relationship, obtaining a preset object that does not appear in a current image frame as the predicted object from the interactive relationship database based on at least one recognized object, wherein the preset object is obtained from the preset interactive relationship information of the at least one recognized object, and the preset object is preset to interact with the at least one recognized object (see at least abstract, ¶ [0136-0138], and Figs. 11-12 disclosing observing objects in an image frame, including certain objects and speculative objects associated with the certain objects, and predicting their respective trajectories); and calculating the predicted trajectory of the predicted object based on the first type of object interactive relationship and a trajectory of the at least one recognized object, wherein the first type of object interactive relationship indicates an interactive relationship between the at least one recognized object and the preset object that does not appear in the current image frame (see at least abstract, ¶ [0136-0138], and Figs. 11-12).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to incorporate the certain and speculative objects of Schmueli Friedland with the trajectory prediction system of Yan, with a reasonable expectation of success, because both inventions are directed toward predicting the path of an object and/or obstacle for a vehicle and preventing collisions involving them. This would help the vehicle more accurately determine the trajectory of both seen and unseen obstacles by retrieving information on the type of object approaching the vehicle, thereby better helping the vehicle avoid colliding with the object.

The combination of Yan and Schmueli Friedland does not disclose wherein determining the first trajectory of the first vehicle based on the predicted trajectory comprises: calculating a predicted collision time based on the predicted trajectory of the predicted object, which is a virtual moving object not detected by a sensor of the first vehicle, and based on an original target trajectory of the first vehicle, and adjusting the original target trajectory of the first vehicle based on the predicted collision time to generate the first trajectory of the first vehicle.

However, Petroff suggests calculating a predicted collision time based on the predicted trajectory of the predicted object, which is a virtual moving object not detected by a sensor of the first vehicle, and based on an original target trajectory of the first vehicle, and adjusting the original target trajectory of the first vehicle based on the predicted collision time to generate the first trajectory of the first vehicle (see at least abstract, ¶ [0039-0045], and Figs. 5A-7 disclosing an autonomous vehicle processor detecting an occlusion close to an intersection and generating a phantom obstacle whose speed it predicts to determine a conflict zone in the intersection and the timing to the conflict zone where the autonomous vehicle may collide with the phantom obstacle). While Petroff does not explicitly suggest its phantom obstacle is associated with a detected obstacle, it does preempt potential collisions from unseen obstacles that are occluded by the environment and would likely be present given proximity to an intersection. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to incorporate the collision timing of Petroff with the combination of Yan and Schmueli Friedland, with a reasonable expectation of success, because both inventions are directed toward predicting the path of an object and/or obstacle for a vehicle and preventing collisions involving them. This would allow the system to gauge how much of an immediate threat the speculative object is by timing its approach and avoiding a collision before the predicted collision time.

Claims 3, 5, 13, and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Yan in view of Schmueli Friedland and Petroff, as applied to claims 1 and 11 above, and further in view of Rao (WO 2020113187 A1).
Regarding claims 3 and 13, the combination of Yan, Schmueli Friedland, and Petroff does not explicitly disclose determining whether the preset interactive relationship information comprises a first type or a second type of object interactive relationship, and generating a determination result; and generating the predicted trajectory of the predicted object based on the determination result. However, Rao teaches determining whether the preset interactive relationship information comprises a first type or a second type of object interactive relationship, and generating a determination result (see at least ¶ [0064-0066] and [0069] and Figs. 4-5 and 9 where the vehicle identifies and classifies the type of object according to a database and predicts a trajectory based on the identification); and generating the predicted trajectory of the predicted object based on the determination result (see at least ¶ [0064-0066] and [0069] and Figs. 4-5 and 9). It would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to incorporate the object classification of Rao with the combination of Yan, Schmueli Friedland, and Petroff, with a reasonable expectation of success, because all of the inventions are directed toward predicting the path of an object and/or obstacle for a vehicle and preventing collisions involving them. This would help the vehicle more accurately determine the trajectory of the obstacle by retrieving information on the type of object approaching the vehicle, thereby better helping the vehicle avoid colliding with the object.

Regarding claims 5 and 15, the combination of Yan, Schmueli Friedland, and Petroff does not explicitly disclose in response to determining that the preset interactive relationship information comprises the second type of object interactive relationship, determining whether the at least one object comprises a second vehicle; in response to determining that the at least one object comprises the second vehicle, determining whether the at least one object comprises a first object with the preset interactive relationship information with the second vehicle; in response to determining that the at least one object comprises the first object, setting the second vehicle as the predicted object; and calculating the predicted trajectory of the predicted object based on the preset interactive relationship information, a position of the first object relative to the predicted object, and a movement velocity of the predicted object.

However, Rao teaches, in response to determining that the preset interactive relationship information comprises the second type of object interactive relationship, determining whether the at least one object comprises a second vehicle, and, in response to determining that the at least one object comprises the second vehicle, determining whether the at least one object comprises a first object with the preset interactive relationship information with the second vehicle (see at least ¶ [0064-0066] and [0069] and Figs. 4-5 and 9 where the vehicle identifies and classifies the type of object according to a database and predicts a trajectory based on the identification, including different vehicles); in response to determining that the at least one object comprises the first object, setting the second vehicle as the predicted object (see at least ¶ [0064-0066] and [0069] and Figs. 4-5 and 9 where the identified object is set as the source of a predicted trajectory according to information retrieved from the database); and calculating the predicted trajectory of the predicted object based on the preset interactive relationship information, a position of the first object relative to the predicted object, and a movement velocity of the predicted object (see at least ¶ [0064-0066] and [0069] and Figs. 4-5 and 9). It would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to incorporate the object classification of Rao with the combination of Yan, Schmueli Friedland, and Petroff, with a reasonable expectation of success, because all of the inventions are directed toward predicting the path of an object and/or obstacle for a vehicle and preventing collisions involving them. This would help the vehicle more accurately determine the trajectory of the obstacle by retrieving information on the type of object approaching the vehicle, thereby better helping the vehicle avoid colliding with the object.

Claims 8 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Yan in view of Schmueli Friedland and Petroff, as applied to claims 1 and 11 above, and further in view of Saiki et al. (US 20180178782 A1). Regarding claims 8 and 18, the combination of Yan, Schmueli Friedland, and Petroff does not explicitly disclose adjusting a driving direction of the first vehicle in the original target trajectory based on the predicted collision time to generate the first trajectory.
However, Saiki teaches adjusting a driving direction of the first vehicle in the original target trajectory based on the predicted collision time to generate the first trajectory (see at least abstract and ¶ [0115-0117] where steering control of the vehicle to avoid a collision is determined in part by a predicted collision time). It would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to incorporate the steering control of Saiki with the combination of Yan, Schmueli Friedland, and Petroff, with a reasonable expectation of success, because the inventions are directed toward predicting the path of an object and/or obstacle for a vehicle and preventing collisions involving them. This would allow the system to gauge how much of an immediate threat the object is by timing its approach and avoiding a collision before the predicted collision time.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Kim (US 20200242941 A1) discloses a driver assistance system (DAS) including a controller including a processor to process image and radar data. The controller identifies an object based on the image and radar data, determines a risk for the object by determining collision possibility with the identified object, and determines a driving position of the vehicle within a driving lane based on the risk for the object. Ansari et al. (US 10156850 B1) discloses systems and methods for determining object motion and controlling autonomous vehicles, including obtaining data indicative of state(s) of a first object and a second object within an environment of an autonomous vehicle.
The operations can include determining a first predicted motion trajectory of the first object; determining a second predicted motion trajectory of the second object based at least in part on the state data and the first predicted motion trajectory of the first object; and determining a motion plan for the autonomous vehicle based at least in part on the second predicted motion trajectory of the second object. Körner et al. (US 20190196472 A1) discloses a system comprising a first data processing module that performs clustering of multiple trajectories based on associated data, a database for retrievably storing the results of the clustering, and interfaces for receiving a request for the transmission of a trajectory and for corresponding transmission of the requested trajectory.

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JARED C BEAN whose telephone number is (571)272-5255. The examiner can normally be reached 7:30AM - 5:00PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anne M Antonucci, can be reached on 313-446-6519. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/J.C.B./
Examiner, Art Unit 3666

/ANNE MARIE ANTONUCCI/
Supervisory Patent Examiner, Art Unit 3666

Prosecution Timeline

Dec 28, 2021: Application Filed
Feb 21, 2024: Non-Final Rejection (§103)
May 24, 2024: Response Filed
Jul 22, 2024: Final Rejection (§103)
Oct 08, 2024: Response after Non-Final Action
Oct 08, 2024: Request for Continued Examination
Oct 29, 2024: Non-Final Rejection (§103)
Jan 14, 2025: Interview Requested
Jan 22, 2025: Applicant Interview (Telephonic)
Jan 22, 2025: Examiner Interview Summary
Feb 07, 2025: Response Filed
Apr 01, 2025: Final Rejection (§103)
Jul 08, 2025: Request for Continued Examination
Jul 12, 2025: Response after Non-Final Action
Jul 15, 2025: Non-Final Rejection (§103)
Sep 19, 2025: Interview Requested
Sep 29, 2025: Examiner Interview Summary
Sep 29, 2025: Applicant Interview (Telephonic)
Oct 22, 2025: Response Filed
Jan 22, 2026: Final Rejection (§103) (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12589898: STEERABLE DEPENDENT VEHICLE FOR UNMANNED AERIAL VEHICLES (Granted Mar 31, 2026; 2y 5m to grant)
Patent 12585269: FLEET MIGRATION MANAGEMENT SYSTEMS AND METHODS (Granted Mar 24, 2026; 2y 5m to grant)
Patent 12552559: ANTENNA MEASUREMENT USING UNMANNED AERIAL VEHICLES (Granted Feb 17, 2026; 2y 5m to grant)
Patent 12530035: TRAVEL CONTROL SYSTEM, CONTROL METHOD, AND CONTROL DEVICE (Granted Jan 20, 2026; 2y 5m to grant)
Patent 12495734: THREE DIMENSIONAL GRID MAP FOR CROSS MEMBER LOCATION AND AVOIDANCE DURING AN UNLOADING OPERATION (Granted Dec 16, 2025; 2y 5m to grant)
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 7-8
Grant Probability: 63%
With Interview: 99% (+38.7%)
Median Time to Grant: 3y
PTA Risk: High

Based on 118 resolved cases by this examiner. Grant probability derived from career allow rate.
