Prosecution Insights
Last updated: April 19, 2026
Application No. 18/142,437

RARE EXAMPLE MINING FOR AUTONOMOUS VEHICLES

Non-Final OA — §101, §102
Filed
May 02, 2023
Examiner
SOOFI, YAZAN A
Art Unit
3668
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Waymo LLC
OA Round
1 (Non-Final)
89%
Grant Probability
Favorable
1-2
OA Rounds
2y 3m
To Grant
99%
With Interview

Examiner Intelligence

Grants 89% — above average
89%
Career Allow Rate
720 granted / 809 resolved
+37.0% vs TC avg
+11.3%
Interview Lift
Moderate lift; resolved cases with interview vs. without
Typical timeline
2y 3m
Avg Prosecution
19 currently pending
Career history
828
Total Applications
across all art units
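The headline percentages above follow from simple ratios over the career counts. A minimal sketch of the arithmetic (illustrative only; variable names are assumptions, not this tool's actual code):

```python
# Career statistics for this examiner, as reported above.
granted = 720
resolved = 809
pending = 19

# Allow rate = granted / resolved (pending cases excluded).
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # prints: Career allow rate: 89.0%

# Total applications = resolved + currently pending.
total = resolved + pending
print(f"Total applications: {total}")  # prints: Total applications: 828
```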

Statute-Specific Performance

§101
18.0%
-22.0% vs TC avg
§103
31.0%
-9.0% vs TC avg
§102
38.6%
-1.4% vs TC avg
§112
9.3%
-30.7% vs TC avg
Black line = Tech Center average estimate • Based on career data from 809 resolved cases
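The deltas above imply a Tech Center average for each statute of the examiner's rate minus the delta. A quick sketch recovering those implied averages (a hedged reconstruction; the tool's underlying data is not shown):

```python
# Examiner statute-specific rates and deltas vs. TC average,
# in percent, as listed above.
stats = {
    "§101": (18.0, -22.0),
    "§103": (31.0, -9.0),
    "§102": (38.6, -1.4),
    "§112": (9.3, -30.7),
}

for statute, (rate, delta) in stats.items():
    tc_avg = round(rate - delta, 1)  # implied TC average estimate
    print(f"{statute}: examiner {rate}%, implied TC avg {tc_avg}%")
```

Notably, every implied average works out to 40.0%, consistent with the page's caveat that the Tech Center line is an estimate.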

Office Action

§101, §102
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

Claims 1-18 of U.S. Application No. 18/142,437, filed on 05/02/2023, have been examined.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-18 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.

101 Analysis – Step 1

The claims are directed to a method for processing the sensor input … to generate a density score and a rareness score (i.e., a process). Therefore, claims 1-18 are within at least one of the four statutory categories.

101 Analysis – Step 2A, Prong I

Regarding Prong I of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) certain methods of organizing human activity, and/or c) mental processes. Independent claim 1 includes limitations that recite an abstract idea (emphasized below) and will be used as a representative claim for the remainder of the 101 rejection. 
Claim 1 recites: A method performed by one or more computers, the method comprising: maintaining a plurality of density estimation models that each correspond to a different rareness type with respect to historical sensor inputs in a driving log generated by sensors on-board a vehicle; receiving a query that references a sensor input; generating, from the sensor input, a corresponding density estimation model input for each of the plurality of density estimation models; processing, using each of the plurality of density estimation models, the corresponding density estimation model input to generate a corresponding density score; generating, for the sensor input, and from the density scores, a rareness score associated with each different rareness type; and providing the rareness scores in response to receiving the query.

The examiner submits that the foregoing bolded limitation(s) constitute a "mental process" because, under its broadest reasonable interpretation, the claim covers performance of the limitation in the human mind. For example, "maintaining…", "receiving…", "processing…", "generating…", and "providing…" in the context of this claim amount to processing and determining a mathematical result based on collected data. Accordingly, the claim recites at least one abstract idea.

101 Analysis – Step 2A, Prong II

Regarding Prong II of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether the claim, as a whole, integrates the abstract idea into a practical application. As noted in the 2019 PEG, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. 
The courts have indicated that additional elements merely using a computer to implement an abstract idea, adding insignificant extra-solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a "practical application." In the present case, the additional limitations beyond the above-noted abstract idea are as follows (where the underlined portions are the "additional limitations" while the bolded portions continue to represent the "abstract idea"):

The claim recites processing the sensor input … to generate a density score and a rareness score (this limitation is a mental process since a human can mentally generate feature vectors from an input).

The claim recites processing, using each of the plurality of density estimation models, the corresponding density estimation model input to generate a corresponding density score (this limitation is both a mathematical concept, since the equation for the density estimation model is given in the specification, and a mental process, since a human can use that equation to calculate the density score).

The claim recites generating, for the sensor input, and from the density scores, a rareness score associated with each different rareness type; and providing the rareness scores in response to receiving the query (this limitation is both a mathematical concept, since the equation for the density estimation model is given in the specification, and a mental process, since a human can use that equation to calculate the density score).

Therefore, claim 1 recites an abstract idea. For the following reason(s), the examiner submits that the above-identified additional limitations do not integrate the above-noted abstract idea into a practical application. 
Regarding the additional limitations of "generating, for the sensor input, and from the density scores, a rareness score associated with each different rareness type; and providing the rareness scores in response to receiving the query," the examiner submits that these limitations are insignificant extra-solution activities that merely use a computer to perform the process. In particular, the receiving steps from the sensors and from the external source are recited at a high level of generality, and generating, for the sensor input, and from the density scores, a rareness score associated with each different rareness type, and providing the rareness scores in response to receiving the query, is a form of insignificant extra-solution activity. Lastly, the data processing steps merely describe how to generally perform the "maintaining…", "receiving…", "processing…", "generating…", and "providing…" steps, the otherwise mental judgments, on a generic or general purpose computer. The vehicle control system is recited at a high level of generality and merely automates the evaluating step. Thus, taken alone, the additional elements do not integrate the abstract idea into a practical application. Further, looking at the additional limitation(s) as an ordered combination or as a whole, the limitation(s) add nothing that is not already present when looking at the elements taken individually. 
For instance, there is no indication that the additional elements, when considered as a whole, reflect an improvement in the functioning of a computer or an improvement to another technology or technical field, apply or use the above-noted judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition, implement or use the above-noted judicial exception with a particular machine or manufacture that is integral to the claim, effect a transformation or reduction of a particular article to a different state or thing, or apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is not more than a drafting effort designed to monopolize the exception (MPEP § 2106.05). Accordingly, the additional limitation(s) do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.

101 Analysis – Step 2B

Regarding Step 2B of the 2019 PEG, representative independent claim 1 does not include additional elements that are sufficient to amount to significantly more than the judicial exception, for reasons similar to those discussed above with respect to determining that the claim does not integrate the abstract idea into a practical application. As discussed above, the additional element of using a processor to perform the "maintaining…", "receiving…", "processing…", and "generating…" steps amounts to nothing more than applying the exception using a generic computer component. Generally applying an exception using a generic computer component cannot provide an inventive concept. And as discussed above, regarding the additional limitation of "sensors," the examiner submits that this limitation is an insignificant extra-solution activity. 
Further, a conclusion that an additional element is insignificant extra-solution activity in Step 2A should be re-evaluated in Step 2B to determine if it is more than what is well-understood, routine, conventional activity in the field. The additional limitation of "sensors …" is a well-understood, routine, and conventional activity because the background recites that the sensors are all conventional sensors mounted on the vehicle, and the specification does not provide any indication that the vehicle controller is anything other than a conventional computer within a vehicle. MPEP 2106.05(d)(II), and the cases cited therein, including Intellectual Ventures I, LLC v. Symantec Corp., 838 F.3d 1307, 1321 (Fed. Cir. 2016), TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610 (Fed. Cir. 2016), and OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363 (Fed. Cir. 2015), indicate that mere collection or receipt of data over a network is a well-understood, routine, and conventional function when it is claimed in a merely generic manner. The additional limitation of "sensors …" is a well-understood, routine, and conventional activity because the Federal Circuit in Trading Techs. Int'l v. IBG LLC, 921 F.3d 1084, 1093 (Fed. Cir. 2019), and Intellectual Ventures I LLC v. Erie Indemnity Co., 850 F.3d 1315, 1331 (Fed. Cir. 2017), for example, indicated that the mere displaying of data is a well-understood, routine, and conventional function. Hence, the claim is not patent eligible.

Dependent claims 2-5 do not recite any further limitations that cause the claims to be patent eligible. Rather, the limitations of the dependent claims are directed toward additional aspects of the judicial exception and/or well-understood, routine, and conventional additional elements that do not integrate the judicial exception into a practical application [provide concise explanation]. 
Therefore, dependent claims 2-5, 8-12, and 14-18 are not patent eligible under the same rationale as provided for in the rejection of claim 1. Therefore, claims 1-18 are ineligible under 35 USC §101.

Regarding claim 7, applicant recites a computer system performing functionalities identical to those of the method of claim 1. The integration of a computer system in claim 7 does not integrate the judicial exception of claim 1 into a practical application of that exception or amount to significantly more than the judicial exception. Regarding claim 8, applicant recites a non-transitory computer-readable storage medium performing functionalities identical to those of the method of claim 1. The integration of a non-transitory computer-readable storage medium in claim 8 does not integrate the judicial exception of claim 1 into a practical application of that exception or amount to significantly more than the judicial exception.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-18 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Cobb et al. (US 20110052068 A1) (hereafter referred to as Cobb).

Regarding claim 1, Cobb teaches a method performed by one or more computers, the method comprising: maintaining a plurality of density estimation models that each correspond to a different rareness type with respect to historical sensor inputs in a driving log generated by sensors on-board a vehicle (Cobb, paragraph 0030, "Network 110 receives video data (e.g., video stream(s), video images, or the like) from the video input source 105. The video input source 105 may be a video camera, a VCR, DVR, DVD, computer, web-cam device, or the like. For example, the video input source 105 may be a stationary video camera aimed at a certain area (e.g., a subway station, a parking lot, a building entry/exit, etc.), which records the events taking place therein"); receiving a query that references a sensor input (Cobb, paragraph 0030, "Network 110 receives video data (e.g., video stream(s), video images, or the like) from the video input source 105. The video input source 105 may be a video camera, a VCR, DVR, DVD, computer, web-cam device, or the like. For example, the video input source 105 may be a stationary video camera aimed at a certain area (e.g., a subway station, a parking lot, a building entry/exit, etc.), which records the events taking place therein"); generating, from the sensor input, a corresponding density estimation model input for each of the plurality of density estimation models (Cobb, paragraph 0064, "At step 476 the anomaly detection component 322 determines a rareness measure for the micro-feature vector. 
That is, the anomaly detection component 322 estimates a measure of the likelihood of observing the particular micro-feature vector, based on the probability density function and the probability micro-feature vector"); processing, using each of the plurality of density estimation models, the corresponding density estimation model input to generate a corresponding density score (Cobb, paragraph 0037, "The context processor component 220 may receive the output from other stages of the pipeline (i.e., the tracked objects and the background and foreground models). Using this information, the context processor 220 may be configured to generate a stream of micro-feature vectors corresponding to foreground patches tracked (by tracker component 210)" Examiner notes that micro-feature vectors can be mapped to feature maps); generating, for the sensor input, and from the density scores, a rareness score associated with each different rareness type (Cobb, paragraph 0064, "At step 476 the anomaly detection component 322 determines a rareness measure for the micro-feature vector. That is, the anomaly detection component 322 estimates a measure of the likelihood of observing the particular micro-feature vector, based on the probability density function and the probability micro-feature vector"); and providing the rareness scores in response to receiving the query (Cobb, paragraph 0064, "At step 476 the anomaly detection component 322 determines a rareness measure for the micro-feature vector. That is, the anomaly detection component 322 estimates a measure of the likelihood of observing the particular micro-feature vector, based on the probability density function and the probability micro-feature vector"). 
Regarding claim 2, Cobb teaches the method, further comprising: maintaining a plurality of embeddings that are generated by neural networks that correspond respectively to the different types of rareness from processing the historical sensor inputs in the driving log generated by sensors on-board the vehicle (Cobb, paragraphs 0025, 0029, 0030, 0032, 0043, and 0046, "Network 110 receives video data (e.g., video stream(s), video images, or the like) from the video input source 105. The video input source 105 may be a video camera, a VCR, DVR, DVD, computer, web-cam device, or the like. For example, the video input source 105 may be a stationary video camera aimed at a certain area (e.g., a subway station, a parking lot, a building entry/exit, etc.), which records the events taking place therein"); and for each different type of rareness: selecting, from the plurality of embeddings, one or more similar embeddings based on similarities of the embeddings with respect to the sensor input (Cobb, paragraph 0064, "At step 476 the anomaly detection component 322 determines a rareness measure for the micro-feature vector. That is, the anomaly detection component 322 estimates a measure of the likelihood of observing the particular micro-feature vector, based on the probability density function and the probability micro-feature vector"); identifying historical sensor inputs from which a corresponding neural network generated the one or more similar embeddings (Cobb, paragraphs 0025, 0029, 0030, 0032, 0043, 0046, and 0064, "At step 476 the anomaly detection component 322 determines a rareness measure for the micro-feature vector. 
That is, the anomaly detection component 322 estimates a measure of the likelihood of observing the particular micro-feature vector, based on the probability density function and the probability micro-feature vector"); and providing the identified historical sensor inputs in response to receiving the query (Cobb, paragraphs 0025, 0029, 0030, 0032, 0043, 0046, and 0064, "At step 476 the anomaly detection component 322 determines a rareness measure for the micro-feature vector. That is, the anomaly detection component 322 estimates a measure of the likelihood of observing the particular micro-feature vector, based on the probability density function and the probability micro-feature vector").

Regarding claim 3, Cobb teaches the method, wherein the types of rareness comprise a first rareness type that represents a rareness in a category of an object depicted in the log of historical sensor inputs generated by sensors on-board the vehicle (Cobb, paragraphs 0025, 0029, 0030, 0032, 0043, 0046, and 0064, "At step 476 the anomaly detection component 322 determines a rareness measure for the micro-feature vector. That is, the anomaly detection component 322 estimates a measure of the likelihood of observing the particular micro-feature vector, based on the probability density function and the probability micro-feature vector"). 
Regarding claim 4, Cobb teaches the method, wherein the rareness types comprise a second rareness type that represents a rareness in a predicted future trajectory of a target agent characterized in the log of historical sensor inputs (Cobb, paragraph 0040, "the primitive event detector 212 may be configured to receive the output of the computer vision engine 135 (i.e., the video images, the micro-feature vectors, and context event stream) and generate a sequence of primitive events--labeling the observed actions or behaviors in the video with semantic meaning" and "for example, a sequence of primitive events related to observations of the computer vision engine 135 occurring at a parking lot could include language semantic vectors representing the following: "vehicle appears in scene," "vehicle moves to a given location," "vehicle stops moving," "person appears proximate to vehicle," "person moves," "person leaves scene," "person appears in scene," "person moves proximate to vehicle," "person disappears," "vehicle starts moving," and "vehicle disappears."" Examiner notes that some of the language semantic vectors are data that describes a predicted future trajectory).

Regarding claim 5, Cobb teaches the method, wherein maintaining the plurality of embeddings comprises: maintaining timestamp metadata associated with the log of the historical sensor inputs from which the plurality of embeddings are generated (Cobb, paragraph 0031, "analyze this raw information to identify foreground patches depicting active objects in the video stream, extract micro-features, and derive a variety of metadata regarding the actions and interactions of such objects, and supply this information to a machine learning engine 140. In turn, the machine learning engine 140 may be configured to classify the objects, evaluate, observe, learn and remember details regarding events (and types of events) that transpire within the scene over time"). 
Regarding claim 6, Cobb teaches the method, further comprising: receiving text that describes contents of a sensor input; generating a textual embedding of the text; and for each rareness type: selecting, from the plurality of embeddings, one or more embeddings by using the textual embedding; identifying historical sensor inputs from which a corresponding neural network generated the one or more embeddings; and providing the identified historical sensor inputs in response to receiving the text (Cobb, paragraphs 0040, 0043, and 0046, "the primitive event detector 212 may be configured to receive the output of the computer vision engine 135 (i.e., the video images, the micro-feature vectors, and context event stream) and generate a sequence of primitive events--labeling the observed actions or behaviors in the video with semantic meaning" and "for example, a sequence of primitive events related to observations of the computer vision engine 135 occurring at a parking lot could include language semantic vectors representing the following: "vehicle appears in scene," "vehicle moves to a given location," "vehicle stops moving," "person appears proximate to vehicle," "person moves," "person leaves scene," "person appears in scene," "person moves proximate to vehicle," "person disappears," "vehicle starts moving," and "vehicle disappears."" Examiner notes that some of the language semantic vectors are data that describes a predicted future trajectory).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to YAZAN A SOOFI, whose telephone number is (469) 295-9189. The examiner can normally be reached on a flex schedule. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. 
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Fadey Jabr, can be reached at 572-272-1516. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/YAZAN A SOOFI/
Primary Examiner, Art Unit 3668

Prosecution Timeline

May 02, 2023
Application Filed
Feb 06, 2026
Non-Final Rejection — §101, §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600426
A VEHICLE WITH ADVERTISING DISPLAY
2y 5m to grant Granted Apr 14, 2026
Patent 12602059
ROBOT, CONTROL METHOD FOR ROBOT, AND RECORDING MEDIUM
2y 5m to grant Granted Apr 14, 2026
Patent 12602062
REDUCING RESISTANCE TO MOVEMENT OF DEVICES THAT INCLUDE CASTERS
2y 5m to grant Granted Apr 14, 2026
Patent 12583482
SYSTEMS AND METHODS FOR MODE CONFUSION AVOIDANCE
2y 5m to grant Granted Mar 24, 2026
Patent 12585290
UNMANNED VEHICLE, SYSTEM OF CONTROLLING UNMANNED VEHICLE, AND METHOD OF CONTROLLING UNMANNED VEHICLE
2y 5m to grant Granted Mar 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
89%
Grant Probability
99%
With Interview (+11.3%)
2y 3m
Median Time to Grant
Low
PTA Risk
Based on 809 resolved cases by this examiner. Grant probability derived from career allow rate.
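The 99% with-interview figure is consistent with treating the +11.3% interview lift as a relative multiplier on the base grant probability. A hedged sketch of that reading (an assumption about the model, since the page does not show its formula):

```python
base = 0.89             # grant probability, taken from the career allow rate
interview_lift = 0.113  # relative lift for resolved cases with interview

# Assumed model: multiply the base rate by (1 + lift), capped at 100%.
with_interview = min(base * (1 + interview_lift), 1.0)
print(f"With interview: {with_interview:.0%}")  # prints: With interview: 99%
```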
