Prosecution Insights
Last updated: April 19, 2026
Application No. 18/865,791

SYSTEMS AND METHODS FOR COLLISION DETECTION AND CLASSIFICATION

Non-Final OA (§101, §103)
Filed: Nov 14, 2024
Examiner: MUNION, JAMES E
Art Unit: 2688
Tech Center: 2600 (Communications)
Assignee: Netradyne Inc.
OA Round: 1 (Non-Final)
Grant Probability: 76% (Favorable)
OA Rounds: 1-2
To Grant: 2y 3m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 76% (103 granted / 135 resolved; +14.3% vs TC avg, above average)
Interview Lift: strong, +23.5% among resolved cases with an interview
Typical Timeline: 2y 3m average prosecution; 30 currently pending
Career History: 165 total applications across all art units
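As a quick arithmetic check of the figures above (the percentages are reported values; the Tech Center average below is merely implied by the stated delta):

```python
# Cross-check the examiner's headline statistics.
granted, resolved = 103, 135
career_allow_rate = granted / resolved
print(f"career allow rate: {career_allow_rate:.1%}")   # ~76.3%, shown as 76%

# The "+14.3% vs TC avg" delta implies a Tech Center average of roughly:
implied_tc_avg = career_allow_rate - 0.143
print(f"implied TC average: {implied_tc_avg:.1%}")     # ~62.0%
```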

Statute-Specific Performance

§101: 5.6% (-34.4% vs TC avg)
§103: 52.2% (+12.2% vs TC avg)
§102: 29.6% (-10.4% vs TC avg)
§112: 9.8% (-30.2% vs TC avg)
Baseline: Tech Center average estimate. Based on career data from 135 resolved cases.

Office Action

Rejections: §101, §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claim 24 is rejected under 35 U.S.C. 101 because the claimed invention is directed to non-statutory subject matter. The specification at para. [0094] gives examples of processor-readable and/or computer-readable storage media but does not explicitly limit the storage medium to be non-transitory. Under the broadest reasonable interpretation (BRI), a computer-readable storage medium covers forms of transitory propagating signals or a data structure per se, and therefore would not be patent-eligible. Transitory media do not fit within recognized categories of statutory subject matter (see MPEP 2106-2106.01). Therefore, claim 24, as properly read in light of the disclosure, encompasses non-statutory subject matter, i.e., a transitory medium/signal; thus the claim as a whole is non-statutory. Applicant may exclude such transitory-type media from the claims by, e.g., amending the claims to recite "A non-transitory computer-readable medium comprising instructions which, when executed by a computer, cause the computer to carry out the steps of the method of...," which would make the claim statutory as shown by the Courts.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-3, 6-8, 11-12, 14-15, 19-21 and 23-24 are rejected under 35 U.S.C. 103 as being unpatentable over Barfield (US Patent Application Publication No. 2016/0094964 A1), in view of Petersen (US Patent No. 11862022 B2).

In re claim 1, Barfield teaches A method (300) of detecting a vehicle collision (Abstract: “Vehicle collisions may be automatically detected and reported…”), comprising: receiving, by at least one processor, vehicle class data of a vehicle (110), and inertial sensor data and speed from at least one sensor (135) in a housing inside a cabin of the vehicle (110) (Para [0020]: “Vehicle 210 may include telematics device 212, such as an aftermarket telematics device installed via an On-Board Diagnostics (OBD) port or a telematics device that is installed during manufacture of vehicle 210… Telematics device 212 may generally operate to sense environmental data (e.g., via accelerometers or other sensors) and/or receive data from the OBD system of vehicle 210, and use the sensed/received data to evaluate a model that is trained to output indications of vehicle collisions. An example implementation of telematics device 212 is described in more detail below with reference to FIG.
3.”, para [0057]: “Although the above-discussion of extracting features, to obtain feature vectors, included a number of features defined based on acceleration, sensor data from other sensors, or other types of data, may also be included in the feature vectors. For example, the feature vectors may include features extracted from measurements obtained from an audio sensor, a gyroscope, a barometer, a speedometer, a compass, and/or other sensors. The feature vectors may additionally include features extracted or derived from other data.”, para [0060]: “Detection-of-collision determinations may alternatively or additionally include generation or detection of other information relating to a potential collision of a given vehicle, such as: the direction of the collision (e.g., the direction of the other vehicle relative to the given vehicle), the speed of the given vehicle before the collision, or other information that may be useful in evaluating the potential severity of the collision.” and para [0062]: “In some implementations, different collision detection models 355 may be trained for different types of vehicles. For example, separate collision detection models may be determined for different classes of vehicles (trucks, compact cars, etc.) or for different makes/models/years of the vehicles. 
Alternatively, an indication of the type of vehicle may be provided as an input feature to a model, which may allow the model to generate the classification result based on the type of vehicle.”); detecting (302), by the at least one processor, a vehicle event based on the inertial sensor data and the speed (SEE BELOW); processing (304), by the at least one processor, the inertial sensor data, the speed, and vehicle class data of the vehicle with an initial classifier to generate an indication that the detected vehicle event is a collision event or a non-collision event (Para [0060]: “As previously mentioned, in some implementations, instead of outputting a binary (collision or no collision) output, the collision detection model may be trained to generate an output indicating multiple (e.g., three or more) potential collision states, such as “no collision,” “non-severe collision,” or “severe” collision. In general, classification models may seek to minimize an overall cost function. By assigning higher costs to higher severity crashes, the cost function operates to penalize the model more for misclassifying severe crashes. In this way, the collision detection model may potentially be tuned to be relatively accurate for high severity crashes, while retaining high false positive rejection characteristics. 
Detection-of-collision determinations may alternatively or additionally include generation or detection of other information relating to a potential collision of a given vehicle, such as: the direction of the collision (e.g., the direction of the other vehicle relative to the given vehicle), the speed of the given vehicle before the collision, or other information that may be useful in evaluating the potential severity of the collision.” and para [0061]: “For example, an operator may specify that the trained collision detection model must classify 100% of the confirmed vehicle collisions as collisions and have no more than a 10% false positive rate before the trained collision detection model will be used in real-world situations.”), and generating, by the at least one processor, a notification based on the event subclass (Para [0018]: “In response to the detection of a potential vehicle collision, the telematics device may alert a call center, such as by signaling the call center via a wireless network, such as a cellular wireless network (at 1.2, “Alert Call Center”). An operator, at the call center, may determine how to best handle the alert. For example, the operator may speak to a driver of the vehicle ask the driver whether the driver needs assistance. In this example, assume that the call center operator determines that emergency response personnel are needed (e.g., the driver may confirm that there has been a non-trivial vehicle collision and/or the driver may fail to respond to voice prompts from the call center operator). In this case, the call center operator may communicate with an emergency response center (e.g., a 911 response center, a police, fire, or ambulance team local to the vehicle collision, etc.) to provide information relevant to the emergency response personnel (e.g., the location of the vehicle collision, the potential severity of the vehicle collision, etc.) 
(at 1.3, “Dispatch Emergency Response Personnel”).” and para [0077]: “Output component 1150 may include a mechanism that outputs information to the operator, such as a display, a speaker, one or more light emitting diodes (LEDs), etc.”).

Barfield fails to teach wherein the indication comprises an initial collision classification confidence value; in response to generating an indication that the detected vehicle event is a collision event, comparing (310), by the at least one processor, the initial collision classification confidence value with a predefined confidence threshold; in response to a determination that the initial collision classification confidence value is below a predefined confidence threshold, determining (320), by the at least one processor, whether a location of the vehicle associated with the detected event is within a distance of a non-collision cluster point; in response to a determination that the location of the vehicle (110) is not within a distance of a non-collision cluster point, classifying (314), by the at least one processor, an event subclass of the collision event based on the inertial sensor data, the speed, and the vehicle class data of the vehicle.

However, Petersen teaches wherein the indication comprises an initial collision classification confidence value (SEE BELOW); in response to generating an indication that the detected vehicle event is a collision event, comparing (310), by the at least one processor, the initial collision classification confidence value with a predefined confidence threshold (Col 6, lines 17-32: “In response to obtaining information regarding a potential collision, data describing the vehicle during a time period before and/or after a time associated with the indication of the potential collision may be analyzed to determine a likelihood that the potential collision is a non-collision event.
In some examples, a likelihood may be represented by a numeric value that stands for a probability that the potential collision is a non-collision event, e.g., 10%, 20%, 35%, etc. In other examples, a likelihood may be a binary indicator, e.g., collision or non-collision, or 0% and 100%. In other examples, a likelihood may be represented by multiple levels, e.g., “yes,” “no,” “most likely,” “likely,” “less likely,” or “unlikely” etc. If the likelihood indicates that the potential collision is not a non-collision, then the system may trigger one or more actions responding to the potential collision.”); in response to a determination that the initial collision classification confidence value is below a predefined confidence threshold (Col 40, lines 33-58: “These probabilities may be used to determine whether a collision occurred and/or characteristics of the collision. For example, a class that has a highest probability may be selected as the accurate answer, and the collision information for that class (whether a collision occurred and/or characteristics of such a collision) may be chosen as the likely correct descriptor of the collision. As another example, the probabilities may be compared to a threshold to determine whether any of the probabilities for any of the classes are above the threshold. If so, all of the classes for which a probability are above a threshold may be reported as potential matches for the potential collision, for a user to review each of the potential matches and the collision information for each of the potential matches. As another example, the probabilities may be compared to determine whether one or more of the probabilities differ from others by more than a threshold amount, such that one or more could be determined to be potential correct matches whereas others are, as compared to those one or more, less likely to be correct. 
Those one or more that stand out from the others may then be reported as potential matches for the potential collision, for a user to review each of the potential matches and the collision information for each of the potential matches. Embodiments are not limited to any particular manner of analyzing probabilities and selecting one or more potential correct matches from the probabilities.”), determining (320), by the at least one processor, whether a location of the vehicle associated with the detected event is within a distance of a non-collision cluster point (SEE Col 21, lines 23-42); in response to a determination that the location of the vehicle (110) is not within a distance of a non-collision cluster point (Col 49, line 57 to Col 50, line 5: “Data for different forms of collisions is illustrated in FIG. 13D, formatted as a scatterplot based on the x and y accelerations with color of the dot indicating accident type and size of the dot indicating speed. A collection of bright blue dots is spread throughout the middle of the graph, cluster in a band around the 0 value on the y-axis but spread throughout the x axis. These are dots associated with different forward-backward collisions, confirming again that a forward-backward collision has little change in the y direction (right-left). The green dots, however, show varying values for change in the y direction (right-left) and are all associated with negative change (deceleration) in the x direction (forward-backward).
These are points associated with angled impacts, where there is substantial change in the right-left direction and the vehicle slows down substantially in the forward-backward direction.”), classifying (314), by the at least one processor, an event subclass of the collision event based on the inertial sensor data, the speed, and the vehicle class data of the vehicle (Col 50, lines 6-33: “As discussed above, attempting to detect or characterize a collision using only acceleration data from a single point during an accident is unreliable. Using techniques described herein, which obtain longitudinal movement information for a time period surrounding an event associated with a potential collision, or other information obtained for a vehicle for that time period, may be highly reliable. FIG. 13E shows a chart demonstrating this high reliability, with low false positions or false negatives. In the chart, a “1” category is not a collision, a “2” category is a forward-backward collision, and a “3” category is an angled impact collision. The graph shows the predictions generated by a system trained using the data of FIG. 13D, compared with the actual scenario. As shown by the chart, the trained classifier will accurately determine, in 98.7% of cases, that no collision occurred when there was, in reality, no collision. The classifier never identifies data related to a not-collision as reflecting an angled-impact collision, and only 1.3% of the time, incorrectly identifies the not-collision as being a forward-backward collision. Similarly, the trained classifier properly identifies, in 95.6% of cases, that a forward-backward collision is a forward-backward collision, with the remaining cases limited to misidentifying the scenario as a not-collision. 
Lastly, the trained classifier correctly concludes, in 98.2% of cases, that an angled-impact collision is an angled-impact collision, with the misidentifications evenly spread, in less than 1% of cases, between not-collisions and forward-backward collisions.”).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Barfield to incorporate the teachings of Petersen to provide wherein the indication comprises an initial collision classification confidence value; in response to generating an indication that the detected vehicle event is a collision event, comparing (310), by the at least one processor, the initial collision classification confidence value with a predefined confidence threshold; in response to a determination that the initial collision classification confidence value is below a predefined confidence threshold, determining (320), by the at least one processor, whether a location of the vehicle associated with the detected event is within a distance of a non-collision cluster point; in response to a determination that the location of the vehicle (110) is not within a distance of a non-collision cluster point, classifying (314), by the at least one processor, an event subclass of the collision event based on the inertial sensor data, the speed, and the vehicle class data of the vehicle, with the AUTOMATIC VEHICLE CRASH DETECTION USING ONBOARD DEVICES of Barfield. Doing so enables analyzing data to determine a likelihood that the potential collision is a non-collision event, where probabilities may be compared to determine whether one or more of the probabilities differ from others by more than a threshold amount, such that one or more could be determined to be potential correct matches whereas others are, as compared to those one or more, less likely to be correct, as recognized by Petersen (Col 6, lines 17-32 and Col 40, lines 33-58).
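The claim 1 logic that the combination is asserted to render obvious can be summarized as a decision flow. The sketch below is illustrative only: the threshold value, cluster radius, and all function names are hypothetical, since the claim leaves the "predefined confidence threshold" and cluster distance unspecified.

```python
import math

# Illustrative sketch of the claim 1 decision flow: an initial classifier
# labels a detected vehicle event with a confidence value; low-confidence
# "collision" labels are cross-checked against known non-collision cluster
# points before an event subclass is assigned.

CONFIDENCE_THRESHOLD = 0.8   # hypothetical "predefined confidence threshold"
CLUSTER_RADIUS_M = 50.0      # hypothetical distance to a non-collision cluster point

def planar_distance(a, b):
    # Simple planar distance; a real system would use geodesic distance.
    return math.hypot(a[0] - b[0], a[1] - b[1])

def handle_event(label, confidence, location, non_collision_clusters, classify_subclass):
    if label != "collision":
        return None                      # non-collision events produce no subclass
    if confidence < CONFIDENCE_THRESHOLD:
        # Low confidence: suppress if the event is near a known non-collision
        # cluster point (e.g., a rough railway crossing).
        if any(planar_distance(location, c) <= CLUSTER_RADIUS_M
               for c in non_collision_clusters):
            return None
    # High confidence, or low confidence but away from non-collision clusters:
    # classify the event subclass (e.g., front / rear / angled / topple).
    return classify_subclass()
```

For example, a low-confidence event located near a known non-collision cluster point is dropped, while the same event far from any cluster point proceeds to subclass classification.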
System claim 14 and computer-readable medium claim 24 are rejected for the same reasons as method claim 1 for having similar limitations and being similar in scope.

In re claim 2, Barfield and Petersen teach all of the limitations of claim 1 stated above where Barfield further teaches wherein detecting (302) the vehicle event comprises identifying, by the at least one processor, a time period during which the vehicle event occurred based on the inertial sensor data (Para [0039]: “Process 400 may include collecting data relating to vehicle collisions (block 410). The type of collected data may be similar to that illustrated in FIG. 2 as vehicle crash and user data 222, external crash sensor data 224, call center data 226, and municipal data 228. As mentioned, the data may include sensor data, such as sampled time-series data relating to the real-world acceleration of vehicles in a collision, as well as other data, such as a call center data 226 or municipal data 228 that may be used to definitively indicate whether a crash occurred and potentially indicate the severity of the crash. In one implementation, whenever a telematics device 212 signals the occurrence of a collision, the indication of the collision, as well as sensor data and other data corresponding to the time of the collision, may be transmitted to and stored by model generation server 220.”).

System claim 15 is rejected for the same reasons as method claim 2 for having similar limitations and being similar in scope.
In re claim 3, Barfield and Petersen teach all of the limitations of claim 2 stated above where Barfield further teaches further comprising: generating, by the at least one processor, a set of features by executing a feature extraction function using at least one of the inertial sensor data or the speed captured during the time period; wherein generating the indication that the detected vehicle event is a collision event or the non-collision event is based on the set of features (Para [0051]: “As shown in FIG. 7, the acceleration magnitude value of 3 g (“3 g Threshold”) has been chosen as the threshold crossing point around which features are extracted. The index at which the acceleration magnitude first crosses the 3 g threshold is shown as i.sub.cross. Parameters k.sub.1 and k.sub.2 may be positive integers that are used to select a window that is dependent on where the acceleration magnitude first crosses the threshold. The beginning and ending of the relevant window is the marked by i.sub.start and i.sub.end. Sampled sensor data from i.sub.start to i.sub.end (illustrated via shading in FIG. 7) may be stored and processed. Using a threshold crossing point to define a window from which features are extracted may allow for a direct comparison between different potential collision events.”). 
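Barfield's threshold-crossing windowing (para [0051], cited above for claim 3) can be sketched as follows. The 3 g threshold comes from the cited passage; the `k1`/`k2` values are illustrative stand-ins for Barfield's k.sub.1 and k.sub.2 parameters, which the reference leaves unspecified.

```python
import numpy as np

# Sketch of Barfield's threshold-crossing window (para [0051]): find the first
# sample i_cross where the acceleration magnitude crosses a 3 g threshold,
# then keep the window [i_cross - k1, i_cross + k2) for feature extraction.

def extract_window(accel_mag, threshold=3.0, k1=20, k2=80):
    accel_mag = np.asarray(accel_mag, dtype=float)
    above = np.flatnonzero(accel_mag >= threshold)
    if above.size == 0:
        return None                      # no threshold crossing: no event window
    i_cross = int(above[0])
    i_start = max(0, i_cross - k1)
    i_end = min(len(accel_mag), i_cross + k2)
    return accel_mag[i_start:i_end]
```

Anchoring the window on the crossing point, rather than on absolute time, is what allows direct comparison between different potential collision events, per the cited passage.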
In re claim 6, Barfield and Petersen teach all of the limitations of claim 1 stated above where Barfield further teaches wherein the collision event comprises at least one of a front collision, a rear collision, a left collision, a right collision, a low-clearance collision, a bird or animal collision, or a topple event (Para [0060]: “Detection-of-collision determinations may alternatively or additionally include generation or detection of other information relating to a potential collision of a given vehicle, such as: the direction of the collision (e.g., the direction of the other vehicle relative to the given vehicle), the speed of the given vehicle before the collision, or other information that may be useful in evaluating the potential severity of the collision.”).

System claim 19 is rejected for the same reasons as method claim 6 for having similar limitations and being similar in scope.

In re claim 7, Barfield and Petersen teach all of the limitations of claim 1 stated above where Petersen further teaches further comprising identifying, by the at least one processor, a region within which vehicle events are to be classified as non-collision events (Col 21, lines 23-42: “As another example that may be implemented in some embodiments, the road surface feature assessment operation at block 306 may be implemented by a collision detection facility to analyze the data describing the vehicle in connection with road surface features (e.g., railway tracks or other road impediments) to determine whether the vehicle is near a road surface feature that is prone to causing acceleration values of vehicles traveling therethrough to be detected as acceleration events. In such case, at least a criterion may be related to whether the vehicle traverses through a road surface feature that is prone to causing acceleration events. The criterion may be implemented as, for example, one or more rules to which the data describing the vehicle may be compared.
For example, a location of the vehicle may be used to determine whether the location of the vehicle matches a known road surface feature prone to causing acceleration events. Examples of a match of the locations include a scenario when the closest distance between the location of the vehicle and the known road surface feature is less than a threshold distance.”) by executing a clustering technique using a plurality of vehicle events (Col 48, lines 7-15: “Once the data is separated in block 1204, in block 1206 the different categories of collisions or other information may be labeled with whether they reflect a collision or characteristics of the type of collision they reflect (e.g., severity, angle of impact). In block 1208, the labeled data may then be separated into clusters by a machine learning engine and features identified with each cluster identified by the machine learning engine, to define the clusters and define the classes.”).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Barfield and Petersen to further incorporate the teachings of Petersen to provide further comprising identifying, by the at least one processor, a region within which vehicle events are to be classified as non-collision events with the AUTOMATIC VEHICLE CRASH DETECTION USING ONBOARD DEVICES of Barfield as modified by Petersen. Doing so enables determining whether the vehicle is near a road surface feature that is prone to causing acceleration values of vehicles traveling therethrough to be detected as acceleration events, as recognized by Petersen (Col 21, lines 23-42).

System claim 20 is rejected for the same reasons as method claim 7 for having similar limitations and being similar in scope.
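Petersen's clustering of labeled event data (Col 48, lines 7-15) could look roughly like the following minimal 2-means sketch. The function name, deterministic initialization, and two-cluster limit are simplifying assumptions for illustration, not Petersen's actual machine learning engine.

```python
import numpy as np

# Hypothetical sketch: group the locations of known non-collision events and
# treat the resulting cluster centroids as the "non-collision cluster points"
# referenced by the claims. Minimal 2-means with deterministic initialization;
# a real system would use a tuned clustering library.

def non_collision_centroids(points, iters=10):
    pts = np.asarray(points, dtype=float)
    centers = pts[[0, -1]].astype(float)      # deterministic init: first and last point
    for _ in range(iters):
        dists = ((pts[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)         # assign each point to its nearest center
        for j in range(2):
            members = pts[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return centers
```

An event location within a threshold distance of one of these centroids would then be treated as falling in a non-collision region, matching the claim 8 check quoted above.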
In re claim 8, Barfield and Petersen teach all of the limitations of claim 7 stated above where Petersen further teaches wherein determining (320) by the at least one processor whether the location of the vehicle associated with the detected event is within a distance of a non-collision cluster point comprises: determining, by the at least one processor, whether the vehicle event corresponds to a non-collision event based on the location of the vehicle being within the region (Col 21, lines 36-42: “For example, a location of the vehicle may be used to determine whether the location of the vehicle matches a known road surface feature prone to causing acceleration events. Examples of a match of the locations include a scenario when the closest distance between the location of the vehicle and the known road surface feature is less than a threshold distance.”).

System claim 21 is rejected for the same reasons as method claim 8 for having similar limitations and being similar in scope.

In re claim 11, Barfield and Petersen teach all of the limitations of claim 1 stated above where Barfield further teaches further comprising transmitting, by the at least one processor, the notification to the vehicle to indicate the event subclass to an operator of the vehicle (Para [0018]: “In response to the detection of a potential vehicle collision, the telematics device may alert a call center, such as by signaling the call center via a wireless network, such as a cellular wireless network (at 1.2, “Alert Call Center”). An operator, at the call center, may determine how to best handle the alert. For example, the operator may speak to a driver of the vehicle ask the driver whether the driver needs assistance. In this example, assume that the call center operator determines that emergency response personnel are needed (e.g., the driver may confirm that there has been a non-trivial vehicle collision and/or the driver may fail to respond to voice prompts from the call center operator).
In this case, the call center operator may communicate with an emergency response center (e.g., a 911 response center, a police, fire, or ambulance team local to the vehicle collision, etc.) to provide information relevant to the emergency response personnel (e.g., the location of the vehicle collision, the potential severity of the vehicle collision, etc.) (at 1.3, “Dispatch Emergency Response Personnel”).” and para [0077]: “Output component 1150 may include a mechanism that outputs information to the operator, such as a display, a speaker, one or more light emitting diodes (LEDs), etc.”).

System claim 23 is rejected for the same reasons as method claim 11 for having similar limitations and being similar in scope.

In re claim 12, Barfield and Petersen teach all of the limitations of claim 11 stated above where Barfield further teaches wherein the notification is transmitted in response to classifying the vehicle event as the collision event (Para [0073]: “When the evaluation indicates a collision (block 1060—Yes), process 1000 may include alerting the call center of the collision (block 1070).”).

Claims 4-5 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Barfield (US Patent Application Publication No. 2016/0094964 A1), in view of Petersen (US Patent No. 11862022 B2) and further in view of Gilbert (US Patent Application Publication No. 2017/0050599 A1).

In re claim 4, Barfield and Petersen teach all of the limitations of claim 3 stated above but fail to teach wherein the feature extraction function comprises a wavelet transform function.
However, Gilbert teaches wherein the feature extraction function comprises a wavelet transform function (Para [0009]: “a pattern matching processor configured to (i) receive motion sensor data from one or more motion sensors on the motor vehicle, (ii) apply a wavelet transformation to the motion sensor data in order to identify features of transformed motion sensor data, (iii) compare one or more of the identified features of the transformed motion sensor data with templates in the template library and (iv) determine an event type based on the comparison.”).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Barfield and Petersen to further incorporate the teachings of Gilbert to provide wherein the feature extraction function comprises a wavelet transform function with the AUTOMATIC VEHICLE CRASH DETECTION USING ONBOARD DEVICES of Barfield as modified by Petersen. Doing so enables determining an event type based on the comparison of one or more identified features of the transformed motion sensor data, as recognized by Gilbert (Para [0009]).

System claim 16 is rejected for the same reasons as method claim 3 and method claim 4 for having similar limitations and being similar in scope.

In re claim 5, Barfield, Petersen and Gilbert teach all of the limitations of claim 4 stated above where Gilbert further teaches wherein the wavelet transform function is applied to each of four time windows, wherein the four time windows comprise a center time window (412), a prior time window (407), a later time window (420), and a non-event time window (405) (Paras [0098]-[0099]: “FIGS. 4a and 4b illustrate longitudinal accelerometer data 402 and lateral accelerometer data 404 for a single impact event as a function of time. This accelerometer data is an example of motion sensor data that can be obtained from a 2D or 3D accelerometer.
The longitudinal accelerometer data 402 provides a waveform with a first peak 406 and a second peak 408. The first peak 406 has a maximum amplitude of around −0.1 to −0.2 g.” “The start of the second peak 408 follows the first peak by about 50 ms. The second peak 408 has a full width at half maxima of about 80 ms and a maximum amplitude of +2.5 g. The lateral accelerometer data 404 provides a waveform with a third peak 410 and a series of peaks 412. The third peak 410 occurs at a similar time as the first peak 406 and has a maximum amplitude of around 0.1 to 0.2 g. The series of peaks 412 occurs at a similar time to the second peak 408 and has a maximum amplitude of around 0.1 to 0.2 g.”).

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Barfield, Petersen and Gilbert to further incorporate the teachings of Gilbert to provide wherein the wavelet transform function is applied to each of four time windows, wherein the four time windows comprise a center time window (412), a prior time window (407), a later time window (420), and a non-event time window (405) with the AUTOMATIC VEHICLE CRASH DETECTION USING ONBOARD DEVICES of Barfield as modified by Petersen and Gilbert. Doing so enables generating a template for the type T of impact event to which the motion sensor data relates, as recognized by Gilbert (Para [0100]).

Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Barfield (US Patent Application Publication No. 2016/0094964 A1), in view of Petersen (US Patent No. 11862022 B2) and further in view of Lang (US Patent Application Publication No. 2012/0265406 A1).
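For the wavelet-transform feature extraction at issue in claims 4-5 (the Gilbert ground above), a minimal sketch is shown below. A single-level Haar transform stands in for Gilbert's wavelet transformation, and the window boundaries and detail-energy feature are illustrative assumptions; Gilbert's template matching is not reproduced.

```python
import numpy as np

# Illustrative wavelet feature extraction over four named time windows,
# mirroring the claim's center / prior / later / non-event windows.

def haar_level1(x):
    x = np.asarray(x, dtype=float)
    if len(x) % 2:
        x = x[:-1]                               # Haar pairs samples; drop odd tail
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)  # low-frequency content
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)  # high-frequency (impact) content
    return approx, detail

def window_features(signal, windows):
    signal = np.asarray(signal, dtype=float)
    feats = {}
    for name, (lo, hi) in windows.items():
        _, detail = haar_level1(signal[lo:hi])
        feats[name] = float(np.sum(detail ** 2))  # detail-coefficient energy per window
    return feats

# Hypothetical window layout (sample indices), an assumption for illustration.
WINDOWS = {"prior": (0, 64), "center": (64, 128),
           "later": (128, 192), "non_event": (192, 256)}
```

Comparing the detail energy of the center window against the prior, later, and non-event windows gives a crude impact signature of the kind a template library could match against.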
In re claim 13, Barfield and Petersen teach all of the limitations of claim 12 stated above where Barfield further teaches and the method further comprises transmitting the notification to an emergency services operator (Para [0018]: “In response to the detection of a potential vehicle collision, the telematics device may alert a call center, such as by signaling the call center via a wireless network, such as a cellular wireless network (at 1.2, “Alert Call Center”). An operator, at the call center, may determine how to best handle the alert. For example, the operator may speak to a driver of the vehicle ask the driver whether the driver needs assistance. In this example, assume that the call center operator determines that emergency response personnel are needed (e.g., the driver may confirm that there has been a non-trivial vehicle collision and/or the driver may fail to respond to voice prompts from the call center operator). In this case, the call center operator may communicate with an emergency response center (e.g., a 911 response center, a police, fire, or ambulance team local to the vehicle collision, etc.) to provide information relevant to the emergency response personnel (e.g., the location of the vehicle collision, the potential severity of the vehicle collision, etc.) (at 1.3, “Dispatch Emergency Response Personnel”).”).

The combination fails to teach wherein the event subclass indicates that the vehicle toppled.

However, Lang teaches wherein the event subclass indicates that the vehicle toppled (Para [0002]: “The signal characteristics and the change in speed in both longitudinal and lateral directions are evaluated via the acceleration sensors; the continuation of a vehicle rollover movement about the longitudinal axis is evaluated via the rolling rate; two-dimensional collision contacts are detected quickly via the pressure sensors, and the collision speed and collision overlap are detected essentially via forward-looking sensors.
Conventionally, both the evaluation algorithms and the sensor configuration are designed and applied on the basis of standardized crash tests.").

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the combination of Barfield and Petersen to further incorporate the teachings of Lang to provide wherein the event subclass indicates that the vehicle toppled with the AUTOMATIC VEHICLE CRASH DETECTION USING ONBOARD DEVICES of Barfield as modified by Petersen. Doing so enables the continuation of a vehicle rollover movement about the longitudinal axis to be evaluated via the rolling rate, as recognized by Lang (Para [0002]).

Allowable Subject Matter

Claims 9-10 and 22 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. The following is a statement of reasons for the indication of allowable subject matter: the prior art of record does not expressly teach or render obvious, in the context of the claims taken as a whole:

Regarding claim 9, further comprising updating, by the at least one processor, a size of the region in response to detecting further non-collision events within or proximate to the region. System claim 22 is allowed for the same reasons as method claim 9, for having similar limitations and being similar in scope.

Regarding claim 10, further comprising suppressing, by the at least one processor, the notification if the vehicle event is determined to correspond to the non-collision event based on the location of the vehicle being within the region.

Moreover, modifying the prior art to achieve these claim limitations can only be achieved by hindsight, as no other reference includes these claim limitations.
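Lang's rollover criterion quoted in the claim-13 rejection above (the continuation of a rollover movement about the longitudinal axis is evaluated via the rolling rate) can be sketched as a simple rate-plus-integrated-angle check. This is an illustration only: the thresholds, sample interval, and the combination of rate and angle are assumptions made here, not values or logic from Lang.

```python
def rollover_detected(roll_rate_dps, dt=0.01,
                      angle_threshold_deg=45.0,
                      rate_threshold_dps=90.0):
    """Illustrative rollover check in the spirit of Lang's description:
    the roll rate about the longitudinal axis is monitored, and the
    roll angle (its integral) confirms the rollover is continuing.
    Thresholds and dt are assumptions, not values from Lang.
    """
    angle = 0.0
    for rate in roll_rate_dps:
        angle += rate * dt              # integrate rate (deg/s) to angle (deg)
        if abs(rate) >= rate_threshold_dps and abs(angle) >= angle_threshold_deg:
            return True
    return False

# A vehicle tipping over: roll rate ramps up and stays high, so the
# angle keeps accumulating while the rate remains large.
tipping = [min(10.0 * i, 200.0) for i in range(200)]

# A hard lane change: brief rate spikes in both directions, but the
# angle never accumulates, so no rollover is flagged.
swerve = [120.0] * 5 + [-120.0] * 5 + [0.0] * 190

print(rollover_detected(tipping))   # True under these assumptions
print(rollover_detected(swerve))    # False under these assumptions
```

The design point is the one Lang's quoted passage makes: a transient spike in roll rate alone is not a toppling event; it is the continuation of the movement (here, the accumulated angle) that distinguishes the "vehicle toppled" subclass.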
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JAMES EDWARD MUNION, whose telephone number is (571) 270-0437. The examiner can normally be reached Monday-Friday, 7:30-5:00.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Steven Lim, can be reached at 571-270-1210. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JAMES E MUNION/
Examiner, Art Unit 2688
01/08/2026

Prosecution Timeline

Nov 14, 2024
Application Filed
Jan 08, 2026
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602988
TESTING OF DETECTION AND WARNING FUNCTIONS OF INTERCONNECTED SMOKE, HEAT AND CARBON MONOXIDE ALARMS BY SINGLE PERSON
2y 5m to grant · Granted Apr 14, 2026
Patent 12582095
SYSTEMS, METHODS AND DEVICES FOR COMMUNICATION
2y 5m to grant · Granted Mar 24, 2026
Patent 12560268
CONDUIT SECURITY TECHNIQUES
2y 5m to grant · Granted Feb 24, 2026
Patent 12562045
WEARABLE DEVICE USED AS DIGITAL POOL ATTENDANT
2y 5m to grant · Granted Feb 24, 2026
Patent 12552473
CHAIN PIN ASSEMBLY
2y 5m to grant · Granted Feb 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
76%
Grant Probability
99%
With Interview (+23.5%)
2y 3m
Median Time to Grant
Low
PTA Risk
Based on 135 resolved cases by this examiner. Grant probability derived from career allow rate.
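The headline projections above appear to follow directly from the raw counts in the Examiner Intelligence panel: 103 grants out of 135 resolved cases gives the 76% career allow rate, and adding the +23.5-point interview lift yields the 99% with-interview figure. The simple additive combination (capped at 100%) is an assumption about how the page computes it; the counts themselves are from this page.

```python
# Reproduce the page's headline statistics from the raw counts shown
# in the Examiner Intelligence panel (counts taken from this page).
granted = 103
resolved = 135

career_allow_rate = granted / resolved      # about 0.763, shown as 76%
interview_lift = 0.235                      # +23.5 points, per the panel

# Assumed model: the lift adds directly to the base rate, capped at 1.0.
with_interview = min(career_allow_rate + interview_lift, 1.0)

print(f"Career allow rate: {career_allow_rate:.1%}")
print(f"With interview:    {with_interview:.1%}")
```

Note that 76.3% + 23.5% lands at roughly 99.8%, which the panel rounds down to 99%; treat the with-interview figure as an upper-bound estimate rather than a precise probability.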
