Prosecution Insights
Last updated: April 18, 2026
Application No. 18/258,483

SYSTEMS AND METHODS FOR LATENCY-TOLERANT ASSISTANCE OF AUTONOMOUS VEHICLES

Status: Non-Final OA (§102)
Filed: Jun 20, 2023
Examiner: REFAI, RAMSEY
Art Unit: 3664
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Robocars Inc.
OA Round: 1 (Non-Final)
Grant Probability: 50% (Moderate)
Expected OA Rounds: 1-2
Time to Grant: 3y 5m
With Interview: 61%

Examiner Intelligence

Career Allow Rate: 50% — grants 50% of resolved cases (322 granted / 647 resolved; -2.2% vs TC avg)
Interview Lift: +11.6% on resolved cases with interview (moderate, roughly +12%)
Typical Timeline: 3y 5m average prosecution
Career History: 667 total applications across all art units; 20 currently pending

Statute-Specific Performance

§101: 28.1% (-11.9% vs TC avg)
§103: 26.6% (-13.4% vs TC avg)
§102: 25.7% (-14.3% vs TC avg)
§112: 14.8% (-25.2% vs TC avg)
Tech Center average estimates shown for comparison • Based on career data from 647 resolved cases
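The per-statute deltas are mutually consistent with a single baseline. A minimal sketch (assuming each delta is simply the examiner's allowance rate minus the Tech Center average estimate — an assumption about how this dashboard computes its "vs TC avg" figures, not a documented formula) recovers that baseline:

```python
# Recover the implied Tech Center average from each statute's figures,
# assuming delta = examiner_rate - tc_average (an assumption about the
# dashboard's arithmetic, not a documented formula).
examiner_rate = {"101": 28.1, "103": 26.6, "102": 25.7, "112": 14.8}
delta_vs_tc   = {"101": -11.9, "103": -13.4, "102": -14.3, "112": -25.2}

tc_average = {s: round(examiner_rate[s] - delta_vs_tc[s], 1)
              for s in examiner_rate}
print(tc_average)  # -> {'101': 40.0, '103': 40.0, '102': 40.0, '112': 40.0}
```

Under that assumption, all four statutes imply the same 40.0% baseline, which suggests the comparison may use one overall TC allowance estimate rather than per-statute averages.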

Office Action

§102
DETAILED ACTION

Responsive to the Response to the Election/Restriction filed January 29, 2026. Applicant's election without traverse of Group I (claims 1-13) in the reply is acknowledged. Claims 14-38 are withdrawn from further consideration pursuant to 37 CFR 1.142(b) as being drawn to nonelected inventions, there being no allowable generic or linking claim.

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-13 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Fairfield et al. (US 2016/0370801).

As per claim 1, Fairfield et al. teach a computer-implemented method for receiving latency-tolerant assistance of an autonomous vehicle (AV), comprising: receiving sensory data from the AV; detecting from the sensory data a trigger for assistance; generating a request for assistance comprising at least a portion of the sensory data of the AV (see at least paragraphs [0005-0006]; sensor data from vehicle, request generated with sensory data); receiving, in response to the request for assistance, an operator command for responding to the trigger for assistance; and initiating one or more actuation commands via an actuation subsystem of the AV in response to the received operator command (see at least paragraphs [0029, 0085, 0096]; remote operator responds to the request by sending instructions/commands to the requesting vehicle).
As per claim 2, Fairfield et al. teach transmitting the request for assistance to a remote operator station; presenting the request for assistance to an occupant of the AV; or a combination thereof (see at least paragraphs [0025, 0088, 0098]; request sent to a remote operator or to vehicle passenger).

As per claim 3, Fairfield et al. teach presenting the request for assistance via an output interface comprising a display console, a speaker, tactile feedback, or a combination thereof, on the AV (see at least paragraphs [0006, 0096, 0098]).

As per claim 4, Fairfield et al. teach wherein receiving the operator command comprises receiving input via an input interface comprising a wireless communications interface, a touchscreen, a switch, a button, a knob, a keyboard, a computer mouse, a drawing pad, a camera, a microphone, or a combination thereof on the AV (see at least paragraphs [0096, 0097, 0105]).

As per claim 5, Fairfield et al. teach wherein detecting the trigger for assistance further comprises: identifying one or more objects external to the AV from the sensory data; determining a classification for each of the one or more objects via a plurality of characteristics of the object; and determining that one or more objects creates an assistance event for the AV based at least on the classification type of the object (see at least figs. 4A-4B and paragraphs [0028, 0077, 0080-0082]).

As per claim 6, Fairfield et al. teach wherein detecting the trigger for assistance further comprises: determining a distance and direction of the object with respect to the AV, a direction of travel for the AV, a speed of travel for the AV, or a combination thereof; and wherein the determining the object creates the assistance event for the AV is further based on the distance and direction of the object with respect to the AV, the direction of travel for the AV, the speed of travel for the AV, or the combination thereof (see at least paragraphs [0043, 0056, 0068, 0104]).
As per claim 7, Fairfield et al. teach wherein the sensory data comprises video data received from a plurality of cameras of the AV, laser data received from a plurality of lidar sensors of the AV, radar targets received from a plurality of radar sensors of the AV, ultrasound objects detected from a plurality of ultrasonic sensors of the AV, audible sounds received from a plurality of microphones of the AV, vehicle dynamics data from a plurality of inertial measurement units, processed outputs from the sensory data of the AV, or a combination thereof (see at least paragraphs [0006, 0025, 0059]).

As per claim 8, Fairfield et al. teach wherein the sensor data comprises a front view from the AV, a side view from the AV, a rear view from the AV, or a combination thereof (see at least paragraphs [0056, 0059, 0061]).

As per claim 9, Fairfield et al. teach wherein detecting the trigger for assistance further comprises: identifying a passage of time past a time threshold with no progress in a position of the AV; and determining the passage of time creates an assistance event for the AV based at least on the duration of the time interval, a driving context of the AV, or a combination thereof (see at least paragraphs [0078, 0082]).

As per claim 10, Fairfield et al. teach wherein detecting the trigger for assistance further comprises: determining a distance and direction of the object with respect to the AV, a direction of travel for the AV, a speed of travel for the AV, or a combination thereof; and wherein the determining the object creates the assistance event for the AV is further based on the distance and direction of the object with respect to the AV, the direction of travel for the AV, the speed of travel for the AV, or the combination thereof (see at least paragraphs [0043, 0056, 0068, 0104]).
As per claim 11, Fairfield et al. teach wherein the one or more operator commands received by the AV are latency-tolerant (see at least paragraphs [0097, 0100]) and comprise an increase in AV speed (or speed limit), a decrease in AV speed (or speed limit), maintaining AV speed, instructing the AV to drive around one or more obstacles, instructing the AV to drive over or through one or more obstacles, a continuation of the AV in its current state, maintaining a stationary status until further notice, a gear selection of the AV, a horn initiation, an initiation of vehicle flashers, an emergency alert initiation, an unlocking or locking of a door of the AV, opening or closing of a window of the AV, a headlight initiation of the AV, a direction change of the AV, a route change of the AV, changes to a map used by the AV, a turn instruction for the AV, a lane change of the AV, a re-routing of the AV, a lateral offset in the travel direction of the AV, driving on a road shoulder, driving off-road, following a driving path specified in the operator command, driving over a shoulder, following a specified set of lane markers, yielding to other vehicles at an intersection, performing a zipper merge at a merge point, instructing the AV to take its turn, instructing the AV to merge, positioning the AV over to a shoulder of a road, stopping the AV, yielding to an emergency vehicle, requiring manual takeover of the AV, information to be presented to an occupant of the AV, or a combination thereof (see at least paragraphs [0029, 0097, 0099-0100]).

As per claim 12, Fairfield et al. teach storing the trigger for assistance, parameters associated with the detecting the trigger for assistance, the one or more actuation commands, or a combination thereof, in a remote database; receiving additional sensory data from the AV; and detecting another trigger for assistance from at least in part the additional sensory data and the stored trigger for assistance, parameters associated with the detecting the trigger for assistance, the one or more actuation commands, or the combination thereof (see at least paragraphs [0074, 0089]).

Claim 13 contains similar limitations as claim 1 above and is therefore rejected under a similar rationale.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Ramsey Refai, whose telephone number is (313) 446-4867. The examiner can normally be reached M-F 9am-5pm EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kito Robinson, can be reached at (571) 270-3921. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/RAMSEY REFAI/
Primary Examiner, Art Unit 3664

Prosecution Timeline

Jun 20, 2023
Application Filed
Mar 31, 2026
Non-Final Rejection — §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602642
INFORMATION PROCESSING DEVICE AND INFORMATION PROCESSING METHOD DETERMINING CONTROL METHOD FOR A PLURALITY OF MOVABLE APPARATUSES UTILIZING A FACILITY
Granted Apr 14, 2026 • 2y 5m to grant

Patent 12596384
SYSTEM AND METHOD FOR MAPPING OBSTRUCTIONS IN A WORK AREA TO CORRESPONDING LOCATIONS
Granted Apr 07, 2026 • 2y 5m to grant

Patent 12591842
Node-enabled Logistics Receptacle Apparatus, Systems, and Methods with a Deployable Storage Element for Receiving and Temporarily Maintaining a Delivery Item
Granted Mar 31, 2026 • 2y 5m to grant

Patent 12582038
SYSTEM AND METHOD FOR CONTROLLING THE OPERATION OF AN AGRICULTURAL HARVESTER
Granted Mar 24, 2026 • 2y 5m to grant

Patent 12559141
LOGISTICS SYSTEM COMPRISING A TRUCK AND A TRAILER AND RELATED METHOD
Granted Feb 24, 2026 • 2y 5m to grant
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 50%
With Interview: 61% (+11.6%)
Median Time to Grant: 3y 5m
PTA Risk: Low

Based on 647 resolved cases by this examiner. Grant probability is derived from the career allow rate.
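The headline projections follow directly from the examiner's career counts shown earlier. A minimal sketch of the assumed arithmetic (treating the grant probability as the raw career allow rate and the interview figure as that rate plus the observed +11.6-point lift — both assumptions about this dashboard's method, not documented formulas):

```python
# Derive the headline projections from the examiner's career counts,
# assuming grant probability = career allow rate, and the interview
# figure adds the observed +11.6-point lift (assumptions, not a
# documented formula).
granted, resolved = 322, 647

allow_rate_pct = granted / resolved * 100      # ~49.8%
grant_probability = round(allow_rate_pct)      # displayed as 50 (%)
with_interview = round(allow_rate_pct + 11.6)  # displayed as 61 (%)

print(grant_probability, with_interview)  # -> 50 61
```

Both rounded values match the displayed 50% and 61%, which is consistent with (though does not prove) this derivation.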
