Prosecution Insights
Last updated: April 19, 2026
Application No. 18/982,468

SYSTEMS AND METHODS FOR GENERATING DATA DESCRIBING PHYSICAL SURROUNDINGS OF A VEHICLE

Non-Final OA — §103, §112, §DP
Filed: Dec 16, 2024
Examiner: OH, HARRY Y
Art Unit: 3657
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Smartdrive Systems Inc.
OA Round: 1 (Non-Final)
Grant Probability: 85% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 8m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 85% (584 granted / 684 resolved; +33.4% vs TC avg) — above average
Interview Lift: +18.3% in resolved cases with interview — strong
Typical Timeline: 2y 8m avg prosecution, 23 currently pending
Career History: 707 total applications across all art units
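The headline numbers in this panel are simple ratios over the examiner's resolved cases. As a rough illustration of how an analytics tool might derive them (the with/without-interview split below is hypothetical, chosen only for demonstration — it is not the source data behind the +18.3% figure):

```python
def examiner_stats(granted, resolved, iv_granted, iv_resolved):
    """Career allow rate and interview lift from resolved-case counts.

    iv_* are counts for the subset of resolved cases that had an
    examiner interview; the remainder form the no-interview subset.
    """
    allow_rate = granted / resolved
    with_iv_rate = iv_granted / iv_resolved
    without_iv_rate = (granted - iv_granted) / (resolved - iv_resolved)
    return {
        "career_allow_rate_pct": round(allow_rate * 100, 1),
        # Lift = allow-rate gap between interviewed and non-interviewed
        # cases, in percentage points.
        "interview_lift_pts": round((with_iv_rate - without_iv_rate) * 100, 1),
    }

# 584 granted / 684 resolved come from the panel above; the 99/100
# interview split is an invented example.
stats = examiner_stats(584, 684, iv_granted=99, iv_resolved=100)
```

With these inputs the career allow rate comes out to 85.4%, matching the rounded 85% shown; the lift value depends entirely on the invented interview split.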

Statute-Specific Performance

§101: 6.6% (-33.4% vs TC avg)
§103: 37.0% (-3.0% vs TC avg)
§102: 16.2% (-23.8% vs TC avg)
§112: 31.2% (-8.8% vs TC avg)
Based on career data from 684 resolved cases; Tech Center averages are estimates.
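Each figure above reduces to an examiner-rate-minus-baseline comparison. A minimal sketch of that arithmetic (the rates and deltas are the ones shown in the panel; the Tech Center averages are simply backed out from them, not independent data):

```python
# Examiner rejection rate (%) and delta vs the Tech Center average
# (percentage points), per statute, as listed in the panel above.
STATUTE_PERFORMANCE = {
    "101": (6.6, -33.4),
    "103": (37.0, -3.0),
    "102": (16.2, -23.8),
    "112": (31.2, -8.8),
}

def implied_tc_average(statute: str) -> float:
    """Back out the Tech Center average: examiner rate minus delta."""
    rate, delta = STATUTE_PERFORMANCE[statute]
    return round(rate - delta, 1)
```

Backing the baseline out this way gives 40.0% for every statute listed, which suggests the panel's deltas are computed against a single TC-wide estimate rather than per-statute baselines.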

Office Action

§103 §112 §DP
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA.

Priority

The applicant’s claim to priority of 16/025,852, filed 7/2/2018, is acknowledged.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the claims at issue are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); and In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on a nonstatutory double patenting ground provided the reference application or patent either is shown to be commonly owned with this application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b). The USPTO internet Web site contains terminal disclaimer forms which may be used. Please visit http://www.uspto.gov/forms/. The filing date of the application will determine what form should be used.
A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to http://www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.

The claims of the instant application are rejected on the ground of nonstatutory double patenting as being unpatentable over the claims of US Patents 11830365 and 12170023. Although the claims at issue are not identical, they are not patentably distinct from each other because the scope of the claims in the instant application is encompassed by the claims of US Patents 11830365 and 12170023 as mapped below:

Instant Application 18982468 | US Patent 12170023 | US Patent 11830365
1, 9                         | 1, 9               | 1, 9
2, 10                        | 2, 10              | 2, 10
3, 11                        | 3, 11              | 3, 11
4                            | 4                  | 4
5                            | 5                  | 5
6                            | 6                  | 6
7, 12                        | 7, 12              | 7, 12
8                            | 8                  | 8

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 3 and 11 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Regarding claim 3 (and similarly 11), the term “the event report” lacks antecedent basis, rendering the term and the claim unclear.
Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-12 are rejected under 35 U.S.C. 103 as being unpatentable over Binion (US Publication No. 2015/0254781) in view of Shenoy et al. (US 20180001899, hereinafter Shenoy).
Regarding claim 1 (and similarly 9), Binion teaches a system configured to generate simulation scenarios for simulating simulated vehicles, wherein the simulation scenarios mimic physical surroundings of a first vehicle (See at least: [0020], “system 10”; [0047], “The virtual model can be used to visually re-create an accident or other event involving vehicle 12”; [0048], “the model generation unit 82 may process the external sensor data to render a three-dimensional temporal model of the environment of vehicle 12”) that carries a set of sensors including one or more image sensors, wherein the set of sensors is configured to generate output signals conveying information related to one or both of the physical surroundings of the first vehicle and/or operating conditions of the first vehicle (See at least: [0021], “a first external sensor 30 and a second external sensor 32…a still image or video camera device, a lidar (laser remote sensing) device”; [0022], “Each of external sensors 30, 32 generates data, or analog information, that is indicative of the sensed external environment”; [0023], “onboard system 14 also includes hardware, firmware and/or software subsystems that monitor and/or control various operational parameters of vehicle 12”), the system configured to couple with a fleet of vehicles including the first vehicle, the system comprising (See at least: [0102], “each of vehicles 302A-302C”; [0020], “collecting and processing data obtained from a vehicle 12”): one or more processors configured via machine-readable instructions to: derive, in an ongoing manner, from the output signals, the physical surroundings in which the first vehicle is operating, wherein the physical surroundings include (See at least: [0031], “one or more processors”; [0132], “computer-readable instructions”; [0026], “The data collection unit 50 may collect the data and/or analog signals substantially in real time”; [0021], “Each of external sensors 30, 32 is a device configured to sense an environment external to vehicle”): (i) a first speed parameter representing a first speed of the first vehicle traveling in a first lane (See at least: [0016], “monitor and/or control operational parameters of the vehicle (e.g., speed,”; Fig. 2 and Fig. 3), (ii) a second speed parameter representing a second speed of a second vehicle traveling, and (iii) a first distance parameter representing a first distance between the first vehicle and the second vehicle; detect a vehicle event involving the first vehicle, wherein detection is based on the output signals (See at least: [0043], “the average following distance between vehicle 12 and vehicles ahead of vehicle 12”; [0104], “with only a small forward distance to another vehicle (e.g., as detected by a combination of external sensor data and speed data), etc.”); and automatically create, by a simulation component, based on information captured by the one or more image sensors at the time of the detected vehicle event, and further based on the output signals, a simulation scenario for simulating a simulated vehicle by recreating timelines for the first speed parameter, the second speed parameter, and the first distance parameter in the simulation scenario (See at least: [0047], “model generation unit 82”), based on the event report and the generated output signals (See at least: para. [0047], “The model generation unit 82 is configured to process data from external sensor 30 and/or external sensor 32 in order to generate a virtual model”; Fig. 3; [0061], “the virtual model 150 provides an animated re-creation of the accident”; [0072], “As another example, in some embodiments, the virtual model is generated using data from external sensors of at least two different vehicles (e.g., in order to re-create an accident or other event involving both vehicles”; [0017], “time stamps”), wherein the simulation scenario includes surroundings for the simulated vehicle that mimic the physical surroundings of the first vehicle at the time of the detected vehicle event such that the surroundings included in the simulation scenario include a simulated second vehicle travelling at a simulated second speed that corresponds to the second speed parameter, wherein the simulated second vehicle has a simulated first distance from the simulated vehicle that corresponds to the first distance parameter (See at least: [0048], “the model generation unit 82 may process the external sensor data to render a three-dimensional temporal model of the environment of vehicle 12, which may depict the conditions, over time, of an accident that involved or occurred near vehicle 12”; Fig. 3 and para. [0061], “the virtual model 150 provides an animated re-creation of the accident”; [0072], “As another example, in some embodiments, the virtual model is generated using data from external sensors of at least two different vehicles (e.g., in order to re-create an accident or other event involving both vehicles)”, which anticipates simulating all of the information and events associated with the second vehicle), but fails to explicitly teach a second vehicle, in front of the first vehicle, in the first lane.

However, Shenoy teaches a second vehicle, in front of the first vehicle, in the first lane (See at least: [0094], “In an example embodiment, tail gating events (contextual) may be detected where vehicles in front of the vehicle being driven are detected using machine learning, as described earlier. Optionally, lanes can be detected (if lane markers are available) and only vehicles in the same lane or the immediate neighbouring lanes can be considered for analysis. Driving too close to the vehicle in front can mean that the driver is not maintaining a safe braking distance in the event that a vehicle in front stops suddenly. The video data from the dashboard camera is used to obtain samples of vehicles seen from behind, and a machine learning algorithm is used to learn the model. During driving, the model is used to detect a vehicle. The time to collision for the vehicle in front can be computed as described in some methods in the art. A tailgating offense is said to be detected when the vehicle in front is at a distance less than the safe braking distance.”).
Therefore, it would have been obvious to one of ordinary skill in the art at the time of the invention to modify Binion in view of Shenoy to teach a second vehicle, in front of the first vehicle, in the first lane so that the generated simulation scenarios can explicitly include a second vehicle in front of a first vehicle in the same lane to capture driving events regarding rear-end or tailgating type conditions.

Regarding claim 2 (and similarly 10), Binion teaches wherein the physical surroundings further include: one or more roadway parameters representing one or more characteristics of the roadway on which the first vehicle is operating (See at least: [0063], “street conditions (and/or any other environmental conditions) may be determined by model generation unit 82 based on data from external sensor 30 and/or external sensor 32, and/or based on data received vehicle 12 receives from external sources (e.g., via V2X communications)”).

Regarding claim 3 (and similarly 11), Binion teaches wherein the one or more processors are further configured via machine-readable instructions to determine one or both of a make and/or a model of the second vehicle, and wherein the event report further includes information representing one or both of the make and/or the model of the second vehicle (See at least: [0054], “identifying characteristics such as state where the vehicle is registered (e.g., as printed on the license plate), or the color, year, make and/or model of the vehicle”; [0069], “The method 200 then retrieves the sensor data from the memory, and uses the sensor data to generate a virtual model of an event involving at least one vehicle (block 206)”).
Regarding claim 4 (and similarly 12), Binion teaches wherein the one or more processors are further configured via machine-readable instructions to determine one or more external conditions, wherein the one or more external conditions are related to one or more of local weather, local temperature, local precipitation, local visibility, and/or local ambient light, and wherein the simulation scenario is further based on information representing the one or more external conditions (See at least: [0063], “The weather/visibility…(and/or any other environmental conditions) may be determined by model generation unit 82 based on data from external sensor 30 and/or external sensor 32, and/or based on data received vehicle 12 receives from external sources”), and wherein the stored information includes information representing the one or more external conditions (see at least para.[0069], “The method 200 then retrieves the sensor data from the memory, and uses the sensor data to generate a virtual model of an event involving at least one vehicle (block 206)”). 
Regarding claim 5, Binion teaches wherein the one or more processors are further configured via machine-readable instructions to determine one or more external conditions, wherein the one or more external conditions are related to road surface, and wherein the simulation scenario is further based on information representing the one or more external conditions (See at least: [0063], “The weather/visibility…(and/or any other environmental conditions) may be determined by model generation unit 82 based on data from external sensor 30 and/or external sensor 32, and/or based on data received vehicle 12 receives from external sources”), and wherein the stored information includes information representing the one or more external conditions (see at least para. [0069], “The method 200 then retrieves the sensor data from the memory, and uses the sensor data to generate a virtual model of an event involving at least one vehicle (block 206)”).

Regarding claim 6, Binion teaches wherein the one or more processors are further configured via machine-readable instructions to determine one or more external conditions, wherein the one or more external conditions are related to local weather conditions, and wherein the simulation scenario is further based on information representing the one or more external conditions (See at least: [0063], “The weather/visibility…(and/or any other environmental conditions) may be determined by model generation unit 82 based on data from external sensor 30 and/or external sensor 32, and/or based on data received vehicle 12 receives from external sources”), and wherein the stored information includes information representing the one or more external conditions (see at least para. [0069], “The method 200 then retrieves the sensor data from the memory, and uses the sensor data to generate a virtual model of an event involving at least one vehicle (block 206)”).
Regarding claim 7, Binion teaches wherein the first vehicle includes a set of resources carried by the first vehicle, wherein the set of resources includes a first transceiver configured to transfer information from the first vehicle to a remote computing server (See at least: [0030], “interfaces and antennas, that are configured to receive wireless signals”; [0032], “The interface 60 may include a transmitter and one or more antennas”, where the receipt and transmission of data anticipates a transceiver; [0037], “computer system 16 is an electronic processing system capable of performing various functions, and includes an interface 62 configured to receive data from the onboard system 14 of vehicle 12 via network 20”), wherein the second vehicle includes a second set of resources carried by the second vehicle, wherein the second set of resources includes a second transceiver configured to transfer information from the second vehicle to the remote computing server (See at least: [0030], “interfaces and antennas, that are configured to receive wireless signals”; [0032], “The interface 60 may include a transmitter and one or more antennas”, where the receipt and transmission of data anticipates a transceiver; [0104], “computer system 16 stores the vehicle data from vehicles 302A-302C in a memory 330”), wherein the one or more processors are further configured via machine-readable instructions to stream information, using the first transceiver, to the remote computing server, wherein the streamed information includes a representation of the output signals, wherein the system further includes: the remote computing server including one or more physical processors configured by machine-readable instructions to: receive the streamed information from the first transceiver of the first vehicle; and receive a second stream of information from the second transceiver of the second vehicle, wherein the second stream of information includes a representation of sensor signals generated by a second set of sensors carried by the second vehicle (See at least: [0068], “the sensor data is received at a server remote from the vehicle via a wireless communication network”; [0140], “processor such as the processing unit 520”; [0132], “computer-readable instructions”; [0104], “computer system 16 stores the vehicle data from vehicles 302A-302C in a memory 330”).
Regarding claim 8, Binion teaches wherein the one or more processors are further configured via machine-readable instructions to stream information, using a first transceiver carried by the first vehicle, to a remote computing server, wherein the streamed information includes a representation of an event report describing the detected vehicle event, and wherein the first transceiver is configured to transfer the information to the remote computing server (See at least: [0030], “interfaces and antennas, that are configured to receive wireless signals”; [0032], “The interface 60 may include a transmitter and one or more antennas”, where the receipt and transmission of data anticipates a transceiver; [0068], “the sensor data is received at a server remote from the vehicle via a wireless communication network”; [0072-0073], Binion teaches the gathering of sensor data from a first vehicle, wherein the gathered sensor data represents the events associated with the vehicle and anticipates the “event report” of the first vehicle).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Harry Oh whose telephone number is (571) 270-5912. The examiner can normally be reached on Monday-Thursday, 9:00-3:00. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Abby Lin, can be reached on (571) 270-3976. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov.
Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/HARRY Y OH/
Primary Examiner, Art Unit 3657

Prosecution Timeline

Dec 16, 2024
Application Filed
Feb 16, 2026
Non-Final Rejection — §103, §112, §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600028
INDUSTRIAL ROBOT COMPRISING AN AXLE DRIVE WITH A COMPACT CONSTRUCTION
2y 5m to grant • Granted Apr 14, 2026
Patent 12589503
ROBOT CONTROL APPARATUS, ROBOT CONTROL SYSTEM, AND METHOD FOR CONTROLLING ROBOT
2y 5m to grant • Granted Mar 31, 2026
Patent 12589498
Deployment System for Additive Manufacturing Robot Fleet
2y 5m to grant • Granted Mar 31, 2026
Patent 12589486
ROBOT AND ROBOT-CONTROLLING METHOD
2y 5m to grant • Granted Mar 31, 2026
Patent 12576541
SURFACE FINISH QUALITY EVALUATION SYSTEM AND METHOD
2y 5m to grant • Granted Mar 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 85%
With Interview: 99% (+18.3%)
Median Time to Grant: 2y 8m
PTA Risk: Low
Based on 684 resolved cases by this examiner. Grant probability derived from career allow rate.
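The with-interview projection appears to be the base grant probability plus the interview lift, capped below certainty (85% + 18.3 points would otherwise exceed 100%). A minimal sketch under that assumption; the 99% cap and the additive formula are guesses at the tool's method, not documented behavior:

```python
def projected_grant_probability(base_pct: float, lift_pts: float,
                                cap_pct: float = 99.0) -> float:
    """Base grant probability plus interview lift, capped at cap_pct."""
    return round(min(base_pct + lift_pts, cap_pct), 1)

projected_grant_probability(85.0, 18.3)  # capped: 85 + 18.3 exceeds the cap
```

With the panel's inputs this reproduces the 99% shown; for a lower base, e.g. 50%, the uncapped sum of 68.3% would be returned instead.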
