Prosecution Insights
Last updated: April 19, 2026
Application No. 18/805,109

COORDINATED AUTONOMOUS VEHICLE AUTOMATIC AREA SCANNING

Non-Final OA: §101, §102, Double Patenting
Filed
Aug 14, 2024
Examiner
OUELLETTE, JONATHAN P
Art Unit
3629
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
State Farm Mutual Automobile Insurance Company
OA Round
1 (Non-Final)
Grant Probability: 66% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 9m
Grant Probability with Interview: 96%

Examiner Intelligence

Career Allow Rate: 66%, above average (755 granted / 1140 resolved; +14.2% vs TC avg)
Interview Lift: +30.0% (strong); allowance rate among resolved cases with an interview vs. without
Typical Timeline: 3y 9m avg prosecution; 35 applications currently pending
Career History: 1175 total applications across all art units

Statute-Specific Performance

§101: 28.9% (-11.1% vs TC avg)
§103: 18.5% (-21.5% vs TC avg)
§102: 27.8% (-12.2% vs TC avg)
§112: 10.9% (-29.1% vs TC avg)
Tech Center averages are estimates. Based on career data from 1140 resolved cases.
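The headline figures above are internally consistent and can be reproduced from the raw counts shown on this page. A minimal sketch (the 96% with-interview allowance rate is read off the dashboard, not derived here; the +30% lift is its difference from the baseline career allow rate):

```python
# Raw counts from the Examiner Intelligence panel above.
granted = 755          # career applications allowed
resolved = 1140        # career applications resolved (allowed + abandoned)
pending = 35           # applications currently pending
with_interview = 0.96  # allowance rate of resolved cases with an interview (dashboard figure)

# Baseline career allow rate: 755 / 1140.
allow_rate = granted / resolved

# Interview lift: with-interview rate minus the baseline rate.
interview_lift = with_interview - allow_rate

print(f"Career allow rate:  {allow_rate:.0%}")     # 66%
print(f"Interview lift:     {interview_lift:.1%}") # roughly 30%, matching the dashboard after rounding
print(f"Total applications: {resolved + pending}") # 1175
```

This also confirms the "Total Applications" card: 1140 resolved plus 35 pending gives the 1175 shown.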

Office Action

Rejection grounds: §101, §102, nonstatutory double patenting
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

Claims 1-20 have been cancelled and Claims 21-40 have been added; therefore, Claims 21-40 are currently pending in application 18/805,109.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the claims at issue are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); and In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on a nonstatutory double patenting ground provided the reference application or patent either is shown to be commonly owned with this application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b). The USPTO internet Web site contains terminal disclaimer forms which may be used. Please visit http://www.uspto.gov/forms/.
The filing date of the application will determine what form should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to http://www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.

Claims 21-40 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-20 of U.S. Patent No. 12,104,912. Although the claims at issue are not identical, they are not patentably distinct from each other because both inventions disclose equivalent elements for coordinating automatic passive searching using autonomous vehicle components.

Instant application 18/805,109:

21. A computer system for coordinating automatic passive searching using autonomous vehicle components, the computer system comprising: one or more processors; and one or more memories coupled to the one or more processors and storing executable instructions that, when executed by the one or more processors, cause the computer system to: receive an indication of a situation triggering a passive search, the indication including a search area; identify a plurality of vehicles physically located within the search area; transmit an indication of search criteria to at least one of a plurality of computing devices associated with the plurality of vehicles; receive a response from at least one of the plurality of computing devices associated with the plurality of vehicles, the response including an indication of sensor data from at least one sensor of one or more sensors associated with the corresponding vehicle within the search area meeting the search criteria; and implement an action based upon the response.

22. The computer system of claim 21, wherein the executable instructions further cause the computer system to: receive global positioning system (GPS) data indicating current locations of each of the plurality of vehicles, wherein identifying the plurality of vehicles comprises comparing the GPS data associated with the plurality of vehicles to the search area.

23. The computer system of claim 21, wherein the executable instructions further cause the computer system to: receive GPS data indicating current locations of each of the plurality of vehicles and each of a second plurality of vehicles, wherein the GPS data associated with the second plurality of vehicles indicates the second plurality of vehicles are outside the search area; and compare the GPS data to the search area, wherein the identifying the plurality of vehicles comprises determining the GPS data associated with the plurality of vehicles indicates the plurality of vehicles are within the search area.

24. The computer system of claim 21, wherein the executable instructions further cause the computer system to identify the plurality of vehicles by polling the plurality of vehicles for GPS data.

25. The computer system of claim 21, wherein the indication of the search criteria causes the at least one of the plurality of computing devices to activate the at least one sensor associated with the corresponding vehicle and evaluate data generated by the at least one sensor to determine whether the data meets the search criteria.

26. The computer system of claim 21, wherein the indication of the situation includes one or more of a missing person alert, a stolen vehicle alert, or a person of interest alert.

27. The computer system of claim 21, wherein: the indication of the search criteria includes one or more of a license plate number, biometric data identifying a person of interest, or a make, model, or color of a vehicle of interest; and the sensor data meeting the search criteria includes image data from a camera of the corresponding vehicle.

28. The computer system of claim 21, wherein implementing the action based upon the response comprises sending a message containing information regarding the response to a server associated with a third party.

29. A tangible, non-transitory computer-readable medium storing executable instructions for coordinating automatic passive searching using autonomous vehicle components that, when executed by at least one processor of a computer system, cause the computer system to: receive an indication of a situation triggering a passive search, the indication including a search area; identify a plurality of vehicles physically located within the search area; transmit an indication of search criteria to at least one of a plurality of computing devices associated with the plurality of vehicles; receive a response from at least one of the plurality of computing devices associated with the plurality of vehicles, the response including an indication of sensor data from at least one sensor of one or more sensors associated with the corresponding vehicle within the search area meeting the search criteria; and implement an action based upon the response.

30. The tangible, non-transitory computer-readable medium of claim 29, wherein the executable instructions further cause the computer system to: receive global positioning system (GPS) data indicating current locations of each of the plurality of vehicles, wherein identifying the plurality of vehicles comprises comparing the GPS data associated with the plurality of vehicles to the search area.

31. The tangible, non-transitory computer-readable medium of claim 29, wherein the executable instructions further cause the computer system to: receive GPS data indicating current locations of each of the plurality of vehicles and each of a second plurality of vehicles, wherein the GPS data associated with the second plurality of vehicles indicates the second plurality of vehicles are outside the search area; and compare the GPS data to the search area, wherein the identifying the plurality of vehicles comprises determining the GPS data associated with the plurality of vehicles indicates the plurality of vehicles are within the search area.

32. The tangible, non-transitory computer-readable medium of claim 29, wherein the executable instructions further cause the computer system to identify the plurality of vehicles by polling the plurality of vehicles for GPS data.

33. The tangible, non-transitory computer-readable medium of claim 29, wherein the indication of the search criteria causes the at least one of the plurality of computing devices to activate the at least one sensor associated with the corresponding vehicle and evaluate data generated by the at least one sensor to determine whether the data meets the search criteria.

34. The tangible, non-transitory computer-readable medium of claim 29, wherein the indication of the situation includes one or more of a missing person alert, a stolen vehicle alert, or a person of interest alert.

35. The tangible, non-transitory computer-readable medium of claim 29, wherein: the indication of the search criteria includes one or more of a license plate number, biometric data identifying a person of interest, or a make, model, or color of a vehicle of interest; and the sensor data meeting the search criteria includes image data from a camera of the corresponding vehicle.

36. The tangible, non-transitory computer-readable medium of claim 29, wherein implementing the action based upon the response comprises sending a message containing information regarding the response to a server associated with a third party.

37. A computer-implemented method of automatic passive searching using autonomous vehicle components, comprising: receiving, at one or more processors of one or more servers, an indication of a situation triggering a passive search, the indication including a search area; identifying, by the one or more processors of the one or more servers, a plurality of vehicles physically located within the search area; transmitting, to a processor associated with one vehicle of the plurality of vehicles, an indication of search criteria; obtaining, by the processor associated with the one vehicle, sensor data at the one vehicle, based upon the indication of the search criteria; evaluating, by the processor associated with the one vehicle, the sensor data to determine whether the sensor data meets the search criteria; when the sensor data meets the search criteria at the one vehicle of the plurality of vehicles, transmitting, to the one or more processors of the one or more servers, a response including an indication of the sensor data meeting the search criteria; and implementing, by the one or more processors of the one or more servers, an action based upon the response.

38. The computer-implemented method of claim 37, wherein obtaining the sensor data comprises: determining whether the one vehicle is operating; when the one vehicle is operating, accessing the sensor data from one or more sensors of the one vehicle that are already active for the purpose of operating the one vehicle; and when the one vehicle is not operating, activating one or more sensors of the one vehicle to collect sensor data.

39. The computer-implemented method of claim 37, wherein the indication of the situation triggering the passive search also includes one or more of a license plate number, biometric data identifying a person of interest, or a make, model, or color of a vehicle of interest, and wherein the sensor data includes image data from a camera of the one vehicle, and wherein evaluating the sensor data comprises determining the image data includes one or more of the following: a license plate displaying the license plate number, a face matching the biometric data for a plurality of facial features, or a nearby vehicle of the make, model, and color of the vehicle of interest.

40. The computer-implemented method of claim 37, wherein the processor of the one vehicle performs said obtaining, evaluating, and transmitting without notifying a vehicle occupant of the one vehicle.

Reference claims of U.S. Patent No. 12,104,912:

10. A computer system for coordinating automatic passive searching using autonomous vehicle components, comprising: one or more processors; and one or more memories coupled to the one or more processors and storing executable instructions that, when executed by the one or more processors, cause the computer system to: receive an indication of a situation triggering a passive search; determine a plurality of passive search parameters based upon the indication of the situation, including a search area; identify a plurality of vehicles traveling along routes on road segments within the search area; generate an indication of search criteria based upon the plurality of search parameters; transmit the indication of search criteria to at least one of a plurality of computing devices associated with the plurality of vehicles; receive a response from at least one of the plurality of computing devices associated with the plurality of vehicles, the response including an indication of sensor data meeting the search criteria from at least one sensor of one or more sensors associated with the corresponding vehicle while
traveling along the respective route of the vehicle on the road segments within the search area; and implement an action based upon the response.

14. The computer system of claim 10, wherein the executable instructions further cause the computer system to: receive global positioning system (GPS) data indicating current locations of each of the plurality of vehicles and each of a second plurality of vehicles, wherein the GPS data associated with the second plurality of vehicles indicates locations outside the search area, and wherein identifying the plurality of vehicles includes determining the GPS data associated with the plurality of vehicles indicates locations within the search area.

16. The computer system of claim 10, wherein the indication of the search criteria causes the at least one of the plurality of computing devices to activate the one or more sensors associated with the corresponding vehicle and evaluate data generated by the one or more sensors to determine whether the data meets the search criteria.

11. The computer system of claim 10, wherein the indication of the situation includes one or more of a missing person alert, a stolen vehicle alert, or a person of interest alert.

12. The computer system of claim 10, wherein: the indication of the situation includes a location associated with the situation; and the search area is determined based upon the location.

13. The computer system of claim 10, wherein: the indication of the search criteria includes one or more of a license plate number, biometric data identifying a person of interest, or a make, model, or color of a vehicle of interest; and the sensor data meeting the search criteria includes image data from a camera of the corresponding vehicle.

15. The computer system of claim 10, wherein implementing the action based upon the response includes sending a message containing information regarding the response to a server associated with a third party.

17. A tangible, non-transitory computer-readable medium storing executable instructions for coordinating automatic passive searching using autonomous vehicle components that, when executed by at least one processor of a computer system, cause the computer system to: receive an indication of a situation triggering a passive search; determine a plurality of passive search parameters based upon the indication of the situation, including a search area; identify a plurality of vehicles traveling along routes on road segments within the search area; generate an indication of search criteria based upon the plurality of search parameters; transmit the indication of search criteria to at least one of a plurality of computing devices associated with the plurality of vehicles; receive a response from at least one of the plurality of computing devices associated with the plurality of vehicles, the response including an indication of sensor data meeting the search criteria from at least one sensor of one or more sensors associated with the corresponding vehicle while traveling along the respective route of the vehicle on the road segments within the search area; and implement an action based upon the response. (See Claims 6 and 14.)

18. The tangible, non-transitory computer-readable medium of claim 17, wherein: the indication of the situation includes a location associated with the situation; and the search area is determined based upon the location.

19. The tangible, non-transitory computer-readable medium of claim 17, wherein: the indication of the search criteria includes one or more of a license plate number, biometric data identifying a person of interest, or a make, model, or color of a vehicle of interest; and the sensor data meeting the search criteria includes image data from a camera of the corresponding vehicle.

20. The tangible, non-transitory computer-readable medium of claim 17, wherein the implementing the action based upon the response includes sending a message containing information regarding the response to a server associated with a third party.

1. A computer-implemented method of automatic passive searching using autonomous vehicle components, comprising: receiving, at one or more processors of one or more servers, an indication of a situation triggering a passive search; determining, by the one or more processors of the one or more servers, a plurality of passive search parameters based upon the indication of the situation, including a search area; identifying, by the one or more processors of the one or more servers, a plurality of vehicles traveling along routes on road segments within the search area; generating, by the one or more processors of the one or more servers, an indication of search criteria based upon the plurality of search parameters; transmitting, to the processor associated with one vehicle of the plurality of vehicles, the indication of the search criteria, wherein the processor associated with the one vehicle of the plurality of vehicles is configured to: obtain sensor data from at least one of one or more sensors of the one vehicle based upon the indication of the search criteria while traveling along the routes on the road segments within the search area; evaluate the sensor data to determine whether the sensor data meets the search criteria; and when the sensor data meets the search criteria at the one vehicle of the plurality of vehicles, transmit to the one or more processors of the one or more servers, a response including an indication of the sensor data meeting the search criteria; and implementing, by the one or more processors of the one or more servers, an action based upon the response.

2. The computer-implemented method of claim 1, wherein the indication of the situation includes one or more of the following: a missing person alert, a stolen vehicle alert, or a person of interest alert.

3. The computer-implemented method of claim 1, wherein: the indication of the situation includes a location associated with the situation; and the search area is determined based upon the location.

4. The computer-implemented method of claim 1, wherein: the indication of the search criteria includes one or more of a license plate number, biometric data identifying a person of interest, or a make, model, or color of a vehicle of interest; and the sensor data includes image data from a camera of the vehicle.

5. The computer-implemented method of claim 4, wherein to determine whether the sensor data meets the search criteria includes to determine an image from the camera includes one or more of the following: a license plate displaying the license plate number, a face matching the biometric data for a plurality of facial features, or a nearby vehicle of the make, model, and color of the vehicle of interest.

6. The computer-implemented method of claim 1, further comprising: receiving, at the one or more processors of the one or more servers, global positioning system (GPS) data indicating current locations of each of the plurality of vehicles and each of a second plurality of vehicles, wherein the GPS data associated with the second plurality of vehicles indicates locations outside the search area, and wherein the identifying the plurality of vehicles includes determining the GPS data associated with the plurality of vehicles indicates locations within the search area.

7. The computer-implemented method of claim 1, wherein to obtain sensor data includes activating the at least one of the one or more sensors of the vehicle in response to receiving the indication of the search criteria.

8.
The computer-implemented method of claim 1, wherein the implementing the action based upon the response includes sending a message containing information regarding the response to a server associated with a third party.

9. The computer-implemented method of claim 1, wherein the indication of search criteria is received, the sensor data is obtained and evaluated, and at least one of the plurality of vehicles communicates the response without notifying vehicle occupants of the plurality of vehicles.

Claim Rejections - 35 USC § 101

35 U.S.C. § 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 21-40 are rejected under 35 U.S.C. § 101 because the claimed invention is directed to non-statutory subject matter, specifically an abstract idea. Claims 21-40 are directed to a judicial exception (i.e., abstract idea), without providing a practical application, and without providing significantly more.

Under the 35 U.S.C. § 101 subject matter eligibility two-part analysis, Step 1 addresses whether the claim is directed to one of the four statutory categories of invention, i.e., process, machine, manufacture, or composition of matter. See MPEP §2106.03. If the claim does fall within one of the statutory categories, it must then be determined in Step 2A [prong 1] whether the claim is directed to a judicial exception (i.e., law of nature, natural phenomenon, and abstract idea). See MPEP §2106.04. If the claim is directed toward a judicial exception, it must then be determined in Step 2A [prong 2] whether the judicial exception is integrated into a practical application. See MPEP §2106.04(d).
Finally, if the judicial exception is not integrated into a practical application, it must additionally be determined in Step 2B whether the claim recites "significantly more" than the abstract idea. See MPEP §2106.05.

Examiner note: The Office's 2019 Revised Patent Subject Matter Eligibility Guidance (2019 PEG) is currently found in the Ninth Edition, Revision 10.2019 (revised June 2020) of the Manual of Patent Examining Procedure (MPEP), specifically incorporated in MPEP §2106.03 through MPEP §2106.07(c).

Regarding Step 1, Claims 21-28 are directed toward an apparatus (system); Claims 29-36 are directed toward a computer program product having computer-readable tangible storage media (article of manufacture); and Claims 37-40 are directed toward a process (method). Thus, all claims fall within one of the four statutory categories as required by Step 1.

Regarding Step 2A [prong 1], Claims 21-40 are directed toward the judicial exception of an abstract idea. Independent claims 21, 29 and 37 are directed specifically to the abstract idea of situational data analysis.
Regarding independent claims 21, 29 and 37, the underlined limitations emphasized below correspond to the abstract ideas of the claimed invention:

A computer-implemented method of automatic passive searching using autonomous vehicle components, comprising: receiving, at one or more processors of one or more servers, an indication of a situation triggering a passive search, the indication including a search area; identifying, by the one or more processors of the one or more servers, a plurality of vehicles physically located within the search area; transmitting, to a processor associated with one vehicle of the plurality of vehicles, an indication of search criteria; obtaining, by the processor associated with the one vehicle, sensor data at the one vehicle, based upon the indication of the search criteria; evaluating, by the processor associated with the one vehicle, the sensor data to determine whether the sensor data meets the search criteria; when the sensor data meets the search criteria at the one vehicle of the plurality of vehicles, transmitting, to the one or more processors of the one or more servers, a response including an indication of the sensor data meeting the search criteria; and implementing, by the one or more processors of the one or more servers, an action based upon the response.
As the underlined claim limitations above demonstrate, independent claims 21, 29 and 37 are directed to the abstract ideas of Mental processes (concepts performed in the human mind (including an observation, evaluation, judgment, or opinion)) and Certain methods of organizing human activity (fundamental economic principles or practices (including hedging, insurance, mitigating risk); commercial or legal interactions (including agreements in the form of contracts; legal obligations; advertising, marketing or sales activities or behaviors; business relations); managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions)).

Dependent claims 22-28, 30-36, and 38-40 provide further details of the abstract idea of claims 21, 29 and 37 regarding the received data; therefore, these claims include mental processes and certain methods of organizing human activity for reasons similar to those provided above for claims 21, 29 and 37. After considering all claim elements, both individually and in combination and in ordered combination, it has been determined that the claims do not amount to significantly more than the abstract idea itself.

Regarding Step 2A [prong 2], Claims 21-40 fail to integrate the recited judicial exception into any practical application. The claims recite additional limitations which are hardware or software elements or a particular technological environment, such as a "computer system", a "tangible, non-transitory computer-readable medium", "autonomous vehicle components", "automatic passive searching", a "processor", computer "memories", "computing devices", "sensors", a "global positioning system (GPS)", "image data", a "camera", "vehicles", and a "server".
However, these limitations are not enough to qualify as a "practical application" being recited in the claims along with the abstract idea, since these limitations are merely invoked as a tool to perform the instructions of an abstract idea in a particular technological environment and/or generally link the use of the abstract idea to a particular technological environment or field of use; merely applying an abstract idea in a particular technological environment, and merely limiting use of an abstract idea to a particular field or technological environment, do not provide a practical application for an abstract idea (MPEP 2106.05(f) & (h)).

The claims do not amount to a "practical application" for the abstract idea because they neither (1) recite any improvements to another technology or technical field; (2) recite any improvements to the functioning of the computer itself; (3) apply the judicial exception with, or by use of, a particular machine; (4) effect a transformation or reduction of a particular article to a different state or thing; nor (5) provide other meaningful limitations beyond generally linking the use of the judicial exception to a particular technological environment.

The relevant question under Step 2A [prong 2] is not whether the claimed invention itself is a practical application; instead, the question is whether the claimed invention includes additional elements beyond the judicial exception that integrate the judicial exception into a practical application by imposing a meaningful limit on the judicial exception. This is not the case with Applicant's claimed invention. Automating the recited claimed features as a combination of computer instructions implemented by computer hardware and/or software elements as recited above does not qualify an otherwise unpatentable abstract idea as patent eligible.
Examples where the Courts have found selecting a particular data source or type of data to be manipulated to be insignificant extra-solution activity include selecting information, based on types of information and availability of information in a power-grid environment, for collection, analysis and display. Electric Power Group, LLC v. Alstom S.A., 830 F.3d 1350, 1354-55, 119 USPQ2d 1739, 1742 (Fed. Cir. 2016). Applicant's limitations as recited above do nothing more than supplement the abstract idea using additional hardware/software computer components as a tool to perform the abstract idea and generally link the use of the abstract idea to a technological environment, which is not sufficient to integrate the judicial exception into a practical application since they do not impose any meaningful limits.

Dependent claims 22-28, 30-36, and 38-40 merely incorporate the additional elements recited above, along with further embellishments of the abstract idea of the respective independent claims, but these features only serve to further limit the abstract idea of the independent claims. Therefore, the additional elements recited in the claimed invention, individually and in combination, fail to integrate the recited judicial exception into any practical application.

Regarding Step 2B, Claims 21-40 fail to amount to "significantly more" than an abstract idea. The claims recite additional limitations which are hardware or software elements or a particular technological environment, such as a "computer system", a "tangible, non-transitory computer-readable medium", "autonomous vehicle components", "automatic passive searching", a "processor", computer "memories", "computing devices", "sensors", a "global positioning system (GPS)", "image data", a "camera", "vehicles", and a "server".
However, these limitations are not enough to qualify as "significantly more" being recited in the claims along with the abstract idea, since these limitations are merely invoked as a tool to perform the instructions of an abstract idea in a particular technological environment and/or generally link the use of the abstract idea to a particular technological environment or field of use; merely applying an abstract idea in a particular technological environment, and merely limiting use of an abstract idea to a particular field or technological environment, do not provide significantly more to an abstract idea (MPEP 2106.05(f) & (h)).

The claims do not amount to "significantly more" than the abstract idea because they neither (1) recite any improvements to another technology or technical field; (2) recite any improvements to the functioning of the computer itself; (3) apply the judicial exception with, or by use of, a particular machine; (4) effect a transformation or reduction of a particular article to a different state or thing; (5) add a specific limitation other than what is well-understood, routine and conventional in the field; (6) add unconventional steps that confine the claim to a particular useful application; nor (7) provide other meaningful limitations beyond generally linking the use of the judicial exception to a particular technological environment.

Dependent claims 22-28, 30-36, and 38-40 merely recite further embellishments of the abstract idea of independent claims 21, 29 and 37, respectively, but these features only serve to further limit the abstract idea of independent claims 21, 29 and 37; none of the dependent claims recites an improvement to a technology or technical field or provides any meaningful limits. The addition of another abstract concept to the limitations of the claims does not render the claims other than abstract.
The 2019 Revised Patent Subject Matter Eligibility Guidance (2019 PEG) specifically states that narrowing an abstract idea does not make the claims "significantly more" than the abstract idea. Thus, the additional elements in the dependent claims only serve to further limit the abstract idea, utilizing the computer components as a tool and/or generally linking the use of the abstract idea to a particular technological environment. Therefore, since no limitations in claims 21-40 transform the exception into a patent-eligible application such that the claims amount to significantly more than the exception itself, and since looking at the limitations as a combination and as an ordered combination adds nothing that is not already present when looking at the elements individually, claims 21-40 are rejected under 35 U.S.C. § 101 as being directed to non-statutory subject matter.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention. Claims 21-40 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Lee et al. (E. Lee, E. -K. Lee, M. Gerla and S. Y.
Oh, "Vehicular cloud networking: architecture and design principles," in IEEE Communications Magazine, vol. 52, no. 2, pp. 148-155, February 2014). As per independent Claims 21, 29, and 37, Lee discloses a computer system for coordinating automatic passive searching using autonomous vehicle components, the computer system comprising: one or more processors; and one or more memories coupled to the one or more processors and storing executable instructions that, when executed by the one or more processors, cause the computer system to (A tangible, non-transitory computer-readable medium storing executable instructions for coordinating automatic passive searching using autonomous vehicle components that, when executed by at least one processor of a computer system, cause the computer system to )(A computer-implemented method of automatic passive searching using autonomous vehicle components, comprising) (See at least Pg.148, “This article especially examines how VANET evolves with two emerging paradigms: vehicular cloud computing and information-centric networking. VCC brings the mobile cloud model to vehicular networks and thus changes the way of network service provisioning, whereas ICN changes the notion of data routing and dissemination. We envision a new vehicular networking system, vehicular cloud networking, on top of them.”): receive an indication of a situation triggering a passive search, the indication including a search area (See at least Pg.148, “Application Content Time-Space Validity — Vehicles produce a great amount of content, while at the same time consuming the content. That is, they become rich data “prosumers.” Such contents show several common properties of local relevance [1, 2]: local validity, explicit lifetime, and local interest. Local validity indicates that vehicle-generated content has its own spatial scope of utility to consumers. 
In safety applications, for instance, a speed warning message near a sharp corner is only valid to vehicles approaching the corner, say within 100 m. Explicit lifetime reflects the fact that vehicle content has its own temporal scope of validity. This also implies that the content must be available during its entire lifetime. For instance, road congestion information may be valid for 30 min, while the validity of a roadwork warning must last until the work is finished. Local interest indicates that nearby vehicles represent the bulk of potential content consumers.”); identify a plurality of vehicles physically located within the search area; transmit an indication of search criteria to at least one of a plurality of computing devices associated with the plurality of vehicles; receive a response from at least one of the plurality of computing devices associated with the plurality of vehicles, the response including an indication of sensor data from at least one sensor of one or more sensors associated with the corresponding vehicle within the search area meeting the search criteria (See at least Fig.3; Pg.149, “ …vehicle applications flood query messages to a local area, not to a specific vehicle, accepting responses regardless of the identity of the content providers. In fact, the response may come from a vehicle in the vicinity that has in turn received such traffic information indirectly through neighboring vehicles. In this case, the vehicle does not care who started the broadcast. This characteristic is mainly due to the fact that the sources of information (vehicles) are mobile and geographically scattered.”; and Pg.151, “For instance, the search range can be a predefined distance, a road section, or an intersection. Having determining the set of necessary resource types, the cloud leader broadcasts a resource request message, RREQ, to nodes within the search range. 
Nodes willing to share their resources send a resource reply message, RREP, back to the leader with information on their resource capabilities.”); and implement an action based upon the response (See at least Pg.149, “Vehicle applications collect such sensor records, even from neighboring vehicles, to produce value-added services. In MobEyes [3], for example, vehicles use a few sensors (including a video camera) to record all surrounding events such as car accidents while driving. Thereafter, Internet agents and/or mobile agents (e.g., police) search the vehicular network for witnesses as part of their investigation.”).

As per Claims 22 and 30, Lee discloses wherein the executable instructions further cause the computer system to: receive global positioning system (GPS) data indicating current locations of each of the plurality of vehicles, wherein identifying the plurality of vehicles comprises comparing the GPS data associated with the plurality of vehicles to the search area (See at least Pgs.150 and 153, Geolocation data incorporated).

As per Claims 23 and 31, Lee discloses wherein the executable instructions further cause the computer system to: receive GPS data indicating current locations of each of the plurality of vehicles and each of a second plurality of vehicles, wherein the GPS data associated with the second plurality of vehicles indicates the second plurality of vehicles are outside the search area; and compare the GPS data to the search area, wherein the identifying the plurality of vehicles comprises determining the GPS data associated with the plurality of vehicles indicates the plurality of vehicles are within the search area (See at least Pgs.150-153).

As per Claims 24 and 32, Lee discloses wherein the executable instructions further cause the computer system to identify the plurality of vehicles by polling the plurality of vehicles for GPS data (See at least Pgs.150-153).
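Claims 22-24 turn on comparing each vehicle's GPS fix against the search area to decide which vehicles to poll. As a minimal sketch of that geofence step, assuming a circular search area and illustrative coordinates (the field names and fleet data below are hypothetical, not taken from the application or from Lee):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two GPS fixes."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def vehicles_in_search_area(vehicles, center, radius_km):
    """Return only the vehicles whose last GPS fix lies inside the search area."""
    return [v for v in vehicles
            if haversine_km(v["lat"], v["lon"], center[0], center[1]) <= radius_km]

# Hypothetical fleet: A sits at the search-area centre, B ~5 km away, C in another city.
fleet = [
    {"id": "A", "lat": 40.7580, "lon": -73.9855},
    {"id": "B", "lat": 40.7128, "lon": -74.0060},
    {"id": "C", "lat": 41.8781, "lon": -87.6298},
]
inside = vehicles_in_search_area(fleet, (40.7580, -73.9855), 10.0)
```

A vehicle outside the radius (C) is simply never sent the search criteria, which mirrors the claim 23 distinction between the first and second plurality of vehicles.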
As per Claims 25 and 33, Lee discloses wherein the indication of the search criteria causes the at least one of the plurality of computing devices to activate the at least one sensor associated with the corresponding vehicle and evaluate data generated by the at least one sensor to determine whether the data meets the search criteria (See at least Pg.151, “The sensor is able to self-actuate as well as detect events in the physical world. With technological advancement, each sensor is directly connected so that external systems can read the sensor data and/or control the sensor”).

As per Claims 26 and 34, Lee discloses wherein the indication of the situation includes one or more of a missing person alert, a stolen vehicle alert, or a person of interest alert (See at least Pg.149, Systems designed to help police investigation; the specific type of investigation is considered by the Examiner to be a design choice).

As per Claims 27 and 35, Lee discloses wherein: the indication of the search criteria includes one or more of a license plate number, biometric data identifying a person of interest, or a make, model, or color of a vehicle of interest; and the sensor data meeting the search criteria includes image data from a camera of the corresponding vehicle (Pg.149, “… vehicles use a few sensors (including a video camera) to record all surrounding events such as car accidents while driving.”).

As per Claims 28 and 36, Lee discloses wherein implementing the action based upon the response comprises sending a message containing information regarding the response to a server associated with a third party (Pg.149, Police receive information).
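The claim 25/27 evaluation step, deciding on-vehicle whether sensor data meets the transmitted search criteria (a license plate, or a make/model/color of interest), can be sketched as follows. This is a minimal illustration only; the detection and criteria field names are hypothetical and not drawn from the application:

```python
def matches_criteria(detection, criteria):
    """Decide whether one vehicle detection satisfies the search criteria."""
    # License-plate criterion: an exact plate match alone suffices.
    if "plate" in criteria and detection.get("plate") == criteria["plate"]:
        return True
    # Vehicle-of-interest criterion: make, model, and color must all match.
    keys = ("make", "model", "color")
    if all(k in criteria for k in keys):
        return all(detection.get(k) == criteria[k] for k in keys)
    return False

# Hypothetical detections extracted from camera image data.
detections = [
    {"plate": "ABC1234", "make": "Ford", "model": "F-150", "color": "red"},
    {"plate": "XYZ9876", "make": "Ford", "model": "F-150", "color": "blue"},
]
plate_hits = [d for d in detections if matches_criteria(d, {"plate": "ABC1234"})]
vehicle_hits = [d for d in detections
                if matches_criteria(d, {"make": "Ford", "model": "F-150", "color": "blue"})]
```

Only matching responses would be reported back (claims 28/36), so the bulk of camera data never leaves the vehicle.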
As per Claim 38, Lee discloses wherein obtaining the sensor data comprises: determining whether the one vehicle is operating; when the one vehicle is operating, accessing the sensor data from one or more sensors of the one vehicle that are already active for the purpose of operating the one vehicle; and when the one vehicle is not operating, activating one or more sensors of the one vehicle to collect sensor data (Pgs.151-153).

As per Claim 39, Lee discloses wherein the indication of the situation triggering the passive search also includes one or more of a license plate number, biometric data identifying a person of interest, or a make, model, or color of a vehicle of interest, and wherein the sensor data includes image data from a camera of the one vehicle, and wherein evaluating the sensor data comprises determining the image data includes one or more of the following: a license plate displaying the license plate number, a face matching the biometric data for a plurality of facial features, or a nearby vehicle of the make, model, and color of the vehicle of interest (See at least Pg.149, Surround event data include vehicle data).

As per Claim 40, Lee discloses wherein the processor of the one vehicle performs said obtaining, evaluating, and transmitting without notifying a vehicle occupant of the one vehicle (Pgs.151-153).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure and can be found in the PTO-892 Notice of References Cited. The Examiner suggests the applicant review all of these documents before submitting any amendments. Abdessamed et al. (D. Abdessamed and M. Samira, "Target Tracking in VANETs Using V2I and V2V Communication," 2014 International Conference on Advanced Networking Distributed Systems and Applications, Bejaia, Algeria, 2014, pp. 19-24) – See at least Pgs.21-22. U. Lee et al. (U. Lee et al., “MobEyes: Smart Mobs for Urban Monitoring with a Vehicular Sensor Network,” UCLA CSD, Tech.
Rep., 2006.) – See at least Pgs.52-55. Any inquiry concerning this communication or earlier communications from the examiner should be directed to JONATHAN P OUELLETTE whose telephone number is (571)272-6807. The examiner can normally be reached on M-F 8am-6pm. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Lynda C Jasmin, can be reached at telephone number (571) 272-6782. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of an application may be obtained from Patent Center. Status information for published applications may be obtained from Patent Center. Status information for unpublished applications is available through Patent Center for authorized users only. Should you have questions about access to Patent Center, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) Form at https://www.uspto.gov/patents/uspto-automated-interview-request-air-form. January 26, 2026 /JONATHAN P OUELLETTE/Primary Examiner, Art Unit 3629

Prosecution Timeline

Aug 14, 2024
Application Filed
Jan 24, 2026
Non-Final Rejection — §101, §102, §DP
Apr 15, 2026
Interview Requested

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12591860
OPERATIONAL SIMULATIONS OF PLANNED MAINTENANCE FOR VEHICLES
2y 5m to grant Granted Mar 31, 2026
Patent 12586043
Social Match Platform Apparatus, Method, and System
2y 5m to grant Granted Mar 24, 2026
Patent 12586038
INTELLIGENT SYSTEM AND METHOD OF OPTIMIZING CROSS-TEAM INFORMATION FLOW
2y 5m to grant Granted Mar 24, 2026
Patent 12572599
SYSTEMS AND METHODS OF GENERATING DYNAMIC ASSOCIATIONS BASED ON USER OBJECT ATTRIBUTES
2y 5m to grant Granted Mar 10, 2026
Patent 12567037
LEARNING ACCELERATION USING INSIGHT-ASSISTED INTRODUCTIONS
2y 5m to grant Granted Mar 03, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
66%
Grant Probability
96%
With Interview (+30.0%)
3y 9m
Median Time to Grant
Low
PTA Risk
Based on 1140 resolved cases by this examiner. Grant probability derived from career allow rate.
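The projection figures above follow from simple arithmetic on the examiner's career record: 755 grants out of 1140 resolved cases gives the 66% base rate, and the page's 66% + 30.0% = 96% suggests the interview lift is applied as additive percentage points. A sketch under that assumption (the cap at 100% is also an assumption about how the dashboard combines the two figures):

```python
def career_allow_rate(granted, resolved):
    """Career allow rate as a whole-number percentage."""
    return round(100.0 * granted / resolved)

def with_interview(base_pct, lift_pct):
    """Interview-adjusted grant probability, assuming an additive
    percentage-point lift capped at 100."""
    return min(base_pct + lift_pct, 100.0)

base = career_allow_rate(755, 1140)    # examiner's career record
adjusted = with_interview(base, 30.0)  # the observed interview lift
```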
