Prosecution Insights
Last updated: April 19, 2026
Application No. 17/251,124

AUTONOMOUS VEHICLE SIMULATOR USING NETWORK PLATFORM

Status: Final Rejection (§103)
Filed: Dec 10, 2020
Examiner: WECHSELBERGER, ALFRED H.
Art Unit: 2187
Tech Center: 2100 — Computer Architecture & Software
Assignee: Morai
OA Round: 4 (Final)

Grant Probability: 58% (Moderate)
Projected OA Rounds: 5-6
Projected Time to Grant: 3y 8m
Grant Probability with Interview: 94%
Examiner Intelligence

Career Allow Rate: 58% (grants 58% of resolved cases; 122 granted / 212 resolved; +2.5% vs TC avg)
Interview Lift: +36.5% (strong lift for resolved cases with interview vs. without)
Avg Prosecution: 3y 8m (typical timeline; 42 currently pending)
Total Applications: 254 (career history, across all art units)

Statute-Specific Performance

§101: 30.0% (-10.0% vs TC avg)
§103: 38.9% (-1.1% vs TC avg)
§102: 3.8% (-36.2% vs TC avg)
§112: 24.0% (-16.0% vs TC avg)

Tech Center averages are estimates. Based on career data from 212 resolved cases.

Office Action (§103)
DETAILED ACTION

Claims 1 and 3 – 11 have been presented for examination. Claim 1 is currently amended. Claim 2 is cancelled. Claim 11 is new. This Office Action is in response to the amendments dated 08/01/2025.

Response to Rejections under 35 USC § 112

Applicant’s amendments overcome the 112(b) rejection. Therefore, it is withdrawn.

Response to Rejections under 35 USC § 103

Applicant’s arguments have been fully considered. However, the Office does not consider them to be persuasive.

Applicant argues: “The Examiner acknowledges that Walther does not disclose that the data packets are input by different users. (Office Action, page 10). Thus, Walther does not teach or suggest "wherein the first, second, and third data packets are transmitted to a simulator, and the second data packet or the third data packet is input by another user," (emphasis added) as recited in amended independent claim 1.”

In response to applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). Specifically, Putz (Database) is relied upon to teach “input by another user”.

Applicant argues: “However, pages 4-5 of Putz merely describe different input sources for a database, and does not teach that different users input different data packets. In addition, the input data described in Putz is data generated as a result of simulation. Thus, the input data in Putz does not correspond to the second data packet of the second algorithm or the third data packet of the third algorithm for simulating the autonomous vehicle in amended independent claim 1.”

Applicant argues that Putz does not teach at least “data packets input by different users”. Examiner notes that Putz explicitly teaches that different users uploaded data sets (see Page 4, Right: “In addition, user rights for each uploaded data sets are assigned to create individual data sharing options”). Further, the uploaded data is not limited to being generated by simulation (see Page 4, Left: “Data from field tests or naturalistic driving studies commonly have a high volume due to various sensor set-ups (image processing, Lidar scans, etc.) and no focus on a specific scenario.”).

Applicant argues: “In contrast with the cited references, as shown in FIG. 10 of the present application below, the contents of each algorithm are expressed as source code, script, instructions, or parameters and stored in the database in the form of a data packet. In other words, perception algorithm a is expressed in the form of data packet 1, planning algorithm b is expressed in the form of data packet 2, and so on. In addition, a dataset corresponds to a set of each data packet by type (for example, dataset 1 in Figure 10 is a collection of perception algorithms (data packets)). A plurality of users input a plurality of data packets to data sets and a plurality of the data sets are collected and transmitted as autonomous vehicle data to the simulator, whereby the plurality of users may mutually share the data packets and operate their own data packets in more various operation algorithms. For example, by referring to FIG. 10, user 1 can test the data packet 1 of her perception algorithm with the combination of the data packet 1 input by user 1, the data packet 2 in the dataset 2 input by user 3, and the data packet c in the dataset 3 input by user 5. Thus, even if user 1 only developed a subset of an entire integrated algorithm, she can test her algorithm with the help of other sub-algorithms input by other users. In this regard, a plurality of users may more clearly verify their algorithms for simulating autonomous vehicles by sharing the data packets input by the plurality of users or by interworking with counterpart data packets under access authorization given to each of the plurality of users.” (emphasis added)

In response to applicant's argument that the references fail to show certain features of the invention, it is noted that the features upon which applicant relies (i.e., the algorithms are scripts in the form of a data packet stored in the database) are not recited in the rejected claim(s). Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993). Specifically, Applicant appears to argue the inventive concept and not the invention as claimed.

Applicant argues: “New Claim 11 is patentable over the cited references for at least its dependence from claim 1. In addition, claim 11 is separately patentable over the cited references because the cited references do not teach or suggest "wherein each of the first algorithm, the second algorithm, and the third algorithm is one of an algorithm for perceiving objects by the autonomous vehicle, an algorithm for planning operations of the autonomous vehicle, and algorithm for controlling operations of the autonomous vehicle," as recited in claim 11.”

Applicant argues that the cited references do not teach the specifically recited algorithms. The broadest reasonable interpretation is discussed in MPEP 2111. Examiner notes that the first data packet is merely “of a first algorithm” (and similarly for the second and third).
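Applicant's FIG. 10 arrangement described above (users contributing typed data packets to shared datasets, then combining one packet from each dataset into the autonomous vehicle data sent to the simulator) can be sketched as follows. This is an illustrative model of the argument only; all identifiers and values are hypothetical, not taken from the application or the cited references.

```python
# Illustrative sketch of the FIG. 10 data model: each dataset collects data
# packets of one algorithm type, contributed by different users, and a
# simulation run combines one packet from each dataset.
from dataclasses import dataclass

@dataclass(frozen=True)
class DataPacket:
    packet_id: str
    owner: str           # user who input the packet
    algorithm_type: str  # "perception", "planning", or "control"
    contents: str        # source code, script, instructions, or parameters

# Dataset 1: perception packets; Dataset 2: planning; Dataset 3: control
datasets = {
    "perception": {"p1": DataPacket("p1", "user1", "perception", "...")},
    "planning":   {"b2": DataPacket("b2", "user3", "planning", "...")},
    "control":    {"c":  DataPacket("c",  "user5", "control", "...")},
}

def assemble_vehicle_data(selection):
    """Collect one packet per dataset into the autonomous vehicle data
    transmitted to the simulator."""
    return [datasets[ds_type][pid] for ds_type, pid in selection.items()]

# user1 tests her perception packet together with sub-algorithms input by
# other users (user3's planning packet, user5's control packet)
run = assemble_vehicle_data({"perception": "p1", "planning": "b2", "control": "c"})
owners = [p.owner for p in run]
```

The point of contention in the argument maps onto this sketch directly: the packets in a single run have different `owner` values.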
Further, the claim does not explicitly limit how the algorithms are represented in combination with said data packet (see Applicant’s arguments: “the contents of each algorithm are expressed as source code, script, instructions, or parameters and stored in the database in the form of a data packet.”). Therefore, the broadest reasonable interpretation of “first data packet of a first algorithm” includes data related to the algorithm, and is not limited to “the contents of each algorithm … expressed in the form of a data packet”. Walther explicitly teaches that the autonomous vehicle testing environment comprises

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1, 3 – 4 and 7 – 10 are rejected under 35 U.S.C. 103 as being unpatentable over Walther et al. (US 10599546) (henceforth “Walther (546)”) in view of Putz et al., “DATABASE APPROACH FOR THE SIGN-OFF PROCESS OF HIGHLY AUTOMATED VEHICLES” (henceforth “Putz (Database)”), and further in view of AWS Documentation > AWS IoT > Developer Guide (henceforth “AWS (IoT)”). Walther (546), Putz (Database), and AWS (IoT) are analogous art because they solve the same problem of simulating autonomous vehicle behavior, and since they are in the same field of autonomous vehicles.

With regard to claim 1, Walther (546) teaches:

an autonomous vehicle simulator system comprising: (Walther (546) Col. 3, Lines 25 – 30: “For instance a user can configure a test by creating a testing environment (e.g., simulated environment, test track environment, etc.) in which an autonomous vehicle (e.g., simulated vehicle, real vehicle, etc.) will operate in accordance with the autonomous vehicle computing system”)

a database storing (Walther (546) Col. 9, Lines 53 – 67: a database stores all the data to re-run a test on the autonomous vehicle, “The testing system can store the associated data in an accessible memory such as, for example, a searchable database. … This can allow the testing system to quickly retrieve the associated data structure (e.g., linked data structure) from the accessible memory. … Accordingly, the testing system can initiate any of the tests to obtain additional test result data”; and Col. 27, Lines 13 – 16: the database can be stored remotely, “In some implementations, the computing device(s) 801 can obtain data from one or more memories that are remote from the testing system 100.”)

a plurality of autonomous vehicle operation data for a plurality of autonomous vehicles, (Walther (546) Col. 3, Lines 19 – 40: scenarios of autonomous vehicle operations are used for testing, where such testing can be saved in the searchable database, “Such tests can be conducted using an offline simulated environment and/or by deploying an autonomous vehicle on a test track … The testing scenario can be a situation used to test the one or more autonomous vehicle capabilities. The testing scenario can indicate testing parameters”)

operation environment data (Walther (546) Col. 4, Lines 16 – 21: tests can be based on accessible driving log data, “In some implementations, a test can be based on previously collected driving logs that were acquired from one or more autonomous vehicles deployed in the real-world. For such log-based testing, the testing system and/or a user can select driving log data (and/or a section thereof) associated with an event that occurred in the real-world.”; and Col. 7, Lines 7 – 14: driving log data contains information about the environment of the autonomous vehicle)

wherein each of the autonomous vehicle operation data includes a plurality of data sets and each of the plurality of data sets includes a plurality of data packets; (Walther (546) Col. 4, Lines 38 – 41: data of a plurality of tests are obtained (each of the autonomous vehicle operation data includes a plurality of data sets), each having various inputs regarding the test (a plurality of data packets), “The testing system can obtain data indicative of one or more initial inputs associated with the simulated environment to help configure the test”; and Col. 27, Lines 59 – 63: various data are stored, “The memory 824 can store data 828 that can be obtained and/or stored. The data 828 can include, for instance, sensor data, perception data, prediction data, motion planning data, driving log data, and/or other data/information as described herein”)

an input module receiving a first data packet of a first algorithm from the user; (Walther (546) Col. 10, Lines 1 – 7: user can modify a test to include a newer software version through a user interface, “Moreover, the testing system can enable the user to modify and/or re-run the test (e.g., to test a newer software version) by providing user input to the user interface. Accordingly, the testing system can initiate any of the tests to obtain additional test result data”; and Col. 9, Lines 15 – 19: the autonomy software stack generated prediction and planning data (first data packet of a first algorithm), “For example, while and/or after the test is running, the testing system can obtain feedback data from the autonomous vehicle computing system (e.g., perception data, prediction data, motion planning data, etc. generated by the autonomy software stack).”)

a control module selecting a second data packet of a second algorithm from a second data set of the plurality of data sets and a third data packet of a third algorithm from a third data set of the plurality of data sets, wherein the first, second, and third data packets are transmitted to a simulator; (Walther (546) Col. 9, Line 62 – Col. 10, Line 6: user can query a database (a control module selecting a data packet) to run a simulation based on the retrieved results (is transmitted to the simulator), where multiple queries can be run with predictable results (a second/third data packet of a second/third algorithm))

the simulator configured to verify the first algorithm with respect to an autonomous vehicle based on first, second, and third data packets for the autonomous vehicle retrieved from the database, (Walther (546) Col. 3, Lines 25 – 30: the new software for the autonomous vehicle inputted by the user is tested, “For instance, a user can configure a test by creating a testing environment (e.g., simulated environment, test track environment, etc.) in which an autonomous vehicle (e.g., simulated vehicle, real vehicle, etc.) will operate in accordance with the autonomous vehicle computing system”; and Col. 9, Line 62 – Col. 10, Line 6: the simulation is based on a user query (based on the first/second/third data packets))

wherein the first algorithm, the second algorithm, and the third algorithm are different algorithms for simulating the autonomous vehicle (Walther (546) Col. 10, Lines 1 – 7: each test for the autonomous vehicle testing environment can be different (different algorithms for simulating))

Walther (546) does not appear to explicitly disclose: that the data packets are input by different users; the second data packet or the third data packet is input by another user; a database management system (DBMS) controlling data input to and output from a database; an input module receiving user identification information of a user; a security module assigning access authorization by matching the user identification information received by the input module to the user information in the database; and that the simulator is configured to verify based on the autonomous vehicle operation data for the autonomous vehicle retrieved by the DBMS from the database.
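The full set of modules recited in claim 1, including the elements for which the Office relies on the secondary references, can be summarized in a skeleton like the one below. This is an illustration of the claim language only, not an implementation from Walther, Putz, or AWS; every class and field name is hypothetical.

```python
# Hypothetical skeleton of the system as claimed: a database, a security
# module matching user identification to stored user information, a control
# module selecting data packets, and a simulator verifying the first algorithm.

class Database:
    def __init__(self):
        self.users = {}     # user information (relied on AWS (IoT))
        self.datasets = {}  # dataset name -> {packet_id: packet}

class SecurityModule:
    def __init__(self, db):
        self.db = db

    def authorize(self, user_id):
        # assign access authorization by matching received user
        # identification information to user information in the database
        return user_id in self.db.users

class ControlModule:
    def select(self, db, dataset, packet_id):
        # select a data packet from one of the plurality of data sets
        return db.datasets[dataset][packet_id]

class Simulator:
    def verify(self, first, second, third):
        # verify the first algorithm using all three retrieved packets;
        # a real simulator would run them together, this sketch only
        # checks that all three packets were retrieved
        return all(p is not None for p in (first, second, third))
```

In a run, the first packet (input by the user) would be passed to `Simulator.verify` together with the second and third packets (input by other users, per the claim as amended).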
However, Putz (Database) teaches: data packets input by different users; a second data packet or a third data packet is input by another user; a database management system (DBMS) controlling data input to and output from a database; (Putz (Database) Page 4, Right: input is processed before being entered into the database using a processing chain (DBMS controlling input to); Page 5, Left: the data processing chain results in specific outputs from the database after processing (controlling output from); and Page 4, Right: data is uploaded by different users (data packets input by different users), “In addition, user rights for each uploaded data sets are assigned to create individual data sharing options”)

an input module receiving user identification information of a user; and a security module assigning, to the user, access authorization to use data packets among the plurality of data packets in each of the plurality of data sets by assigning user rights to uploaded data sets (Putz (Database) Page 4, Right: user rights are received (receiving user identification information) for the purpose of controlling data sharing access (assigning access authorization), “In addition, user rights for each uploaded data sets are assigned to create individual data sharing options”; and Page 4, Right: a variety of interfaces are available for input data (modules))

It would have been obvious to one of ordinary skill in the art to combine before the effective filing date of the claimed invention the autonomous vehicle simulation system disclosed by Walther (546) with the autonomous vehicle database information for simulation disclosed by Putz (Database). One of ordinary skill in the art would have been motivated to make this modification in order to desirably control access to autonomous vehicle data (Putz (Database) Page 2, Right: virtual tests are less costly than real-world tests).
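The per-upload rights scheme quoted from Putz (Database), Page 4 ("user rights for each uploaded data sets are assigned to create individual data sharing options"), can be sketched as below. The concrete rights model (uploader plus an explicit share list) is an assumption for illustration; Putz does not specify it at this level of detail.

```python
# Sketch of per-upload user rights in the style described by Putz (Database):
# each uploaded data set gets rights assigned at upload time, creating
# individual data sharing options. The rights model here is assumed.

uploads = {}         # packet_id -> uploader
sharing_rights = {}  # packet_id -> set of users allowed to use the packet

def upload(packet_id, uploader, share_with=()):
    """Register an upload and assign user rights for it."""
    uploads[packet_id] = uploader
    # individual data sharing option: uploader plus explicitly shared users
    sharing_rights[packet_id] = {uploader, *share_with}

def may_use(user, packet_id):
    """Check whether a user holds rights to an uploaded data packet."""
    return user in sharing_rights.get(packet_id, set())

# user3 uploads a planning packet and shares it with user1 only
upload("planning_b", uploader="user3", share_with=("user1",))
```

Under this assumed model, different users input different packets, and the uploader controls who else may use each one.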
Walther (562) in view of Putz (Database) does not appear to explicitly disclose: the database storing user information; a security module assigning, to the user, access authorization to use data packets among the plurality of data packets in each of the plurality of data sets by matching the user identification information received by the input module to the user information in the database; that the control module selects a data packet based on the access authorization to use the data packets assigned to the user. However, AWS (IoT) teaches: a database storing user information; a security module assigning to a user access authorization to use data packets among a plurality of data packets in each of a plurality of data sets by matching user identification information received by an input module to user information in a database; and (AWS (IoT) Section “Security and Identity for AWS IoT” user identities stored in the cloud (a database storing user information, and a security module assigned to a user access authorization) are used to provide access to a database based on received tokens (by matching user identification information received) “Devices connect using your choice of identity (X.509 certificates, IAM users and groups, Amazon Cognito identities, or custom authentication tokens) over a secure connection according to the AWS IoT connection model.”) a control module selecting a data packet from each of the plurality of data sets based on the access authorization to use the data packets assigned to the user (AWS (IoT) Section “Authorization” a user can be given access to any desired resources (selecting a data packet from each of the plurality of data sets based on the access authorization to use data packets assigned to the user)) It would have been obvious to one of ordinary skill in the art to combine before the effective filing date of the claimed invention the autonomous vehicle simulation system including database access control disclosed by Walther 
(5620) in view of Putz (Database) with the controlling access to data on cloud resources disclosed by AWS (IoT). One of ordinary skill in the art would have been motivated to make this modification in order to desirably control access to autonomous vehicle data stored in a database (AWS (IoT) Section “What is AWS IoT?”) (Putz (Database) Page 2, Right) With regard to claim 3, Walther (562) in view of Putz (Database), and further in view of AWS (IoT) teaches all the elements of the parent claim 2, and further teaches: wherein one autonomous vehicle operation data is packaged into first data sets and second data sets, and (Walther (546) Col. 9, Lines 53 – 67 testing scenario data is stored as a data structure comprising multiple components (packaged into first and second data sets) “The testing system can store the associated data in an accessible memory such as, for example, a searchable database. … This can allow the testing system to quickly retrieve the associated data structure ( e.g., linked data structure) from the accessible memory.”) when the DBMS receives first data packets of the first data sets from one user terminal and transmits the first data packets to the simulator, the DBMS determines whether the second data sets in the database have second data packets, and when the second data packets within the second data sets do exist, the DBMS transmits the second data packets to the simulator. (Walther (546) Col. 9, Lines 53 – 67 a user can provide data related to elements of the testing scenario (first data packets of the first data sets) through a user terminal (from one user terminal), and related testing scenario elements are retrieved (when second data package within second data sets do exist, DBMS transmit to the simulator) “The testing system can store the associated data in an accessible memory such as, for example, a searchable database. 
… For example, a user can provide (e.g., via user input to a user interface) a search query indicative of at least one of the test, the testing scenario, or the one or more autonomous vehicle capabilities. In response, the testing system can access the memory and present a user interface indicating the test, its associated testing scenario, the tested autonomous vehicle capabilities, and/or any test results. Moreover, the testing system can enable the user to modify and/or re-run the test”) With regard to claim 4, Walther (546) in view of Putz (Database), and further in view of AWS (IoT) teaches all the elements of the parent claim 1, and further teaches: wherein each of the plurality of autonomous vehicle operation data comprises (Walther (546) Col. 3, Lines 14 – 17 the autonomous vehicle stored data is based on outputs of a software stack “The autonomy software stack can enable the autonomous vehicle to perceive object(s) within its surrounding environment, predict the motion of those objects, and plan the motion of the autonomous vehicle, accordingly”) perception data set, a planning data set, and a control data set. (Walther (562) Col. 27, Lines 59 – 63 sensor and perception data (perception data set), and prediction data (planning data set), and motion planning data are stored “The memory 824 can store data 828 that can be obtained and/or stored. The data 828 can include, for instance, sensor data, perception data, prediction data, motion planning data, driving log data, and/or other data/information as described herein”, and Col. 14, Lines 7 – 14 the motion plan is used to directly control the vehicle (control data set) “In such a case, the autonomous vehicle computing system 104 can provide, to the simulation system 102, data indicative of instructions determined by the vehicle controller system based at least in part on the motion plan 216. 
The simulation system 102 can control the simulated autonomous vehicle based at least in part on the data indicative of the vehicle controller system instructions, as further described herein”) With regard to claim 7, Walther (546) in view of Putz (Database), and further in view AWS (IoT) teaches all the elements of the parent claim 1, and further teaches: an output module either displaying information on virtual environment that is being operated in the simulator or transmitting data. (Putz (Database) Figure 2 data is extracted from the scenario database and transmitted to simulators, and Page 3, Right the database is implemented in PEGASUS (an output module)) It would have been obvious to one of ordinary skill in the art to combine before the effective filing date of the claimed invention the autonomous vehicle simulation system disclosed by Walther (5620) with the autonomous vehicle database information for simulation disclosed by Putz (Database). One of ordinary skill in the art would have been motivated to make this modification in order to desirably control access to autonomous vehicle data (Putz (Database) Page 2, Right virtual tests are less costly than real-world tests). With regard to claim 8, Walther (546) in view of Putz (Database), and further in view AWS (IoT) teaches all the elements of the parent claim 1, and further teaches: wherein the input module receives real-world test drive data of the autonomous vehicle and inputs the real-world test drive data of the autonomous vehicle as the operation environment data of the database through the DBMS, and the simulator receives the autonomous vehicle operation data and the operation environment data from the database. (Walther (546) Col. 
4, Lines 16 – 35 tests can be based on collected driving log data (receives real-world test data and inputs as the operation environment data), where the simulation is based on tests from either collected logs or full simulation (simulator receives from the database) “In some implementations, a test can be based on previously collected driving logs that were acquired from one or more autonomous vehicles deployed in the real-world. For such log-based testing, the testing system and/or a user can select driving log data (and/or a section thereof) associated with an event that occurred in the real-world … In some implementations, a test can be based on a ful; simulation.”, and Col. 7, Lines 7 – 14 driving log data contains information about the environment of the autonomous vehicle) With regard to claim 9, Walther (562) in view of Putz (Database), and further in view of AWS (IoT) teaches all the elements of the parent claim 1, and further teaches: wherein the plurality of data sets for the autonomous vehicle include (Walther (546) Col. 3, Lines 14 – 17 the autonomous vehicle stored data is based on outputs of a software stack “The autonomy software stack can enable the autonomous vehicle to perceive object(s) within its surrounding environment, predict the motion of those objects, and plan the motion of the autonomous vehicle, accordingly”) a perception data set, a planning data set, and a control data set; (Walther (562) Col. 27, Lines 59 – 63 sensor and perception data (perception data set), and prediction data (planning data set), and motion planning data are stored “The memory 824 can store data 828 that can be obtained and/or stored. The data 828 can include, for instance, sensor data, perception data, prediction data, motion planning data, driving log data, and/or other data/information as described herein”, and Col. 
14, Lines 7 – 14 the motion plan is used to directly control the vehicle (control data set) “In such a case, the autonomous vehicle computing system 104 can provide, to the simulation system 102, data indicative of instructions determined by the vehicle controller system based at least in part on the motion plan 216. The simulation system 102 can control the simulated autonomous vehicle based at least in part on the data indicative of the vehicle controller system instructions, as further described herein”) one of the perception data set, the planning data set, and the control data set is modified based on the algorithm from the user; and the autonomous vehicle operation data for the autonomous vehicle retrieved by the DBMS from the database includes the modified data set. (Walther (562) Col. 10, Lines 1 – 7 user can modify a test to include a newer software version (based on the algorithm from the user) “Moreover, the testing system can enable the user to modify and/or re-run the test (e.g., to test a newer software version) by providing user input to the user interface. Accordingly, the testing system can initiate any of the tests to obtain additional test result data”, and Col. 9, Lines 15 – 19 the modified tests by the user produces modified data sets which would be stored and later retrieved (one of data is modified, and the data retrieved includes the modified data set) “For example, while and/or after the test is running, the testing system can obtain feedback data from the autonomous vehicle computing system ( e.g., perception data, prediction data, motion planning data, etc. 
generated by the autonomy software stack).”) With regard to claim 10, Walther (546) in view of Putz (Database), and further in view of AWS (IoT) teaches all the elements of the parent claim 1, and further teaches: wherein the input module is further configured to receive user identification information of another user and (Putz (Database) Page 2, Right user rights are created for individual users (of another user)) another algorithm from the another user; (Walther (546) Col. 10, Lines 1 – 7 any user can desirably modify a test (another) with wholly predictable results) wherein the plurality of data sets for the autonomous vehicle include (Walther (546) Col. 3, Lines 14 – 17 the autonomous vehicle stored data is based on outputs of a software stack “The autonomy software stack can enable the autonomous vehicle to perceive object(s) within its surrounding environment, predict the motion of those objects, and plan the motion of the autonomous vehicle, accordingly”) a perception data set, a planning data set, and a control data set; (Walther (562) Col. 27, Lines 59 – 63 sensor and perception data (perception data set), and prediction data (planning data set), and motion planning data are stored “The memory 824 can store data 828 that can be obtained and/or stored. The data 828 can include, for instance, sensor data, perception data, prediction data, motion planning data, driving log data, and/or other data/information as described herein”, and Col. 14, Lines 7 – 14 the motion plan is used to directly control the vehicle (control data set) “In such a case, the autonomous vehicle computing system 104 can provide, to the simulation system 102, data indicative of instructions determined by the vehicle controller system based at least in part on the motion plan 216. 
The simulation system 102 can control the simulated autonomous vehicle based at least in part on the data indicative of the vehicle controller system instructions, as further described herein”) one of the perception data set, the planning data set, and the control data set is modified as a first modified data set based on the algorithm from the user; (Walther (562) Col. 10, Lines 1 – 7 user can modify a test to include a newer software version (based on the algorithm from the user) “Moreover, the testing system can enable the user to modify and/or re-run the test (e.g., to test a newer software version) by providing user input to the user interface. Accordingly, the testing system can initiate any of the tests to obtain additional test result data”, and Col. 9, Lines 15 – 19 the modified test by the user produces modified data sets which would be stored and later retrieved (one of data is modified) “For example, while and/or after the test is running, the testing system can obtain feedback data from the autonomous vehicle computing system ( e.g., perception data, prediction data, motion planning data, etc. generated by the autonomy software stack).”) another of the perception data set, the planning data set, and the control data set is modified as a second modified data set based on the another algorithm from the another user; and (Walther (562) Col. 10, Lines 1 – 7 the another algorithm would modify one or more data sets in the same manner as the algorithm from the user with wholly predictable results, where the data set modified could be different) the autonomous vehicle operation data for the autonomous vehicle retrieved by the DBMS from the database includes the first modified data set and the second modified data set. (Walther (546) Col. 
9, Lines 15 – 19 the modified tests by the user and another user produce modified data sets which would be stored and later retrieved (includes the first and second modified data sets) “For example, while and/or after the test is running, the testing system can obtain feedback data from the autonomous vehicle computing system ( e.g., perception data, prediction data, motion planning data, etc. generated by the autonomy software stack).)” It would have been obvious to one of ordinary skill in the art to combine before the effective filing date of the claimed invention the autonomous vehicle simulation system disclosed by Walther (5620) with the autonomous vehicle database information for simulation disclosed by Putz (Database). One of ordinary skill in the art would have been motivated to make this modification in order to desirably control access to autonomous vehicle data (Putz (Database) Page 2, Right virtual tests are less costly than real-world tests). With regard to claim 11, Walther (546) in view of Putz (Database), and further in view of AWS (IoT) teaches all the elements of the parent claim 1, and further teaches: wherein each of the first algorithm, the second algorithm, and the third algorithm is one of an algorithm for perceiving objects by the autonomous vehicle, an algorithm for planning operations of the autonomous vehicle, and algorithm for controlling operations of the autonomous vehicle. (Walther (546) Col. 14, Lines 7 – 14 the data packets are directly related to motion planning (first/second/third algorithm is for planning operations of the autonomous vehicle) “In such a case, the autonomous vehicle computing system 104 can provide, to the simulation system 102, data indicative of instructions determined by the vehicle controller system based at least in part on the motion plan 216. 
The simulation system 102 can control the simulated autonomous vehicle based at least in part on the data indicative of the vehicle controller system instructions, as further described herein")

Claims 5 – 6 are rejected under 35 U.S.C. 103 as being unpatentable over Walther (546) in view of Putz (Database), and further in view of AWS (IoT), and further in view of Levinson et al. (WO 2017/079229) (henceforth "Levinson (229)"). Walther (562), Putz (Database), AWS (IoT), and Levinson (229) are analogous art because they solve the same problem of simulating autonomous vehicle behavior, and because they are in the same field of autonomous vehicles.

With regard to claim 5, Walther (562) in view of Putz (Database), and further in view of AWS (IoT), teaches all the elements of the parent claim 1, and does not appear to explicitly disclose: wherein the input module receives data packet information or usage authority request information of the autonomous vehicle from a user terminal, and when the input module receives the data packet information, the DBMS classifies a data set corresponding to the data packet information and inputs the classified data set to the database.

However, Levinson (229) teaches:

wherein the input module receives data packet information or usage authority request information of the autonomous vehicle from a user terminal, and (Levinson (229) Paragraph 80 and Figure 9: a teleoperator process collects vehicle operation messages in combination with a terminal)

when the input module receives the data packet information, the DBMS classifies a data set corresponding to the data packet information and inputs the classified data set to the database. (Levinson (229) Paragraphs 110 – 111 and Figure 27: an object tracker classifies tracked blob data and feedback data may be exchanged with a database; and Paragraph 112: resulting object tracker data is stored in a 3D object database)

It would have been obvious to one of ordinary skill in the art to combine, before the effective filing date of the claimed invention, the autonomous vehicle database information for simulation including database access control disclosed by Walther (562) in view of Putz (Database), and further in view of AWS (IoT), with the classification of object blobs and generation of track data for a database disclosed by Levinson (229). One of ordinary skill in the art would have been motivated to make this modification in order to track desired objects in the environment (Levinson (229) Paragraph 110).

With regard to claim 6, Walther (562) in view of Putz (Database), and further in view of AWS (IoT), and further in view of Levinson (229), teaches all the elements of the parent claim 5, and further teaches:

wherein, when data packet information is previously input from a first user terminal and stored in the database and another user terminal requests authorization to use the corresponding data packet, (AWS (IoT) Section "Authorization": policies can control any operation from any terminal (another user terminal requests authorization to use), and the data plane API allows sending and receiving data from any terminal (wherein, when data packet information is previously input from a first user terminal and stored in the database and another user terminal requests authorization to use the corresponding data packet))

the security module transmits a notification on the authority to use the data packet and data packet sharing information to the first user terminal. (AWS (IoT) Section "Monitoring AWS IoT": the logs desirably show the results of a policy action performed and include the authentication status (the security module transmits a notification on the authority to use the corresponding data packet and data packet sharing information), where the logs could be viewed by any desired user on any terminal (to the first user terminal): "The following example shows a CloudTrail log entry that demonstrates the AttachPolicy action … "UserIdentity":{"type":"AssumedRole", … "accessKeyId":"access-key-id", … "mfaAuthenticated":"false"")

It would have been obvious to one of ordinary skill in the art to combine, before the effective filing date of the claimed invention, the autonomous vehicle database comprising objects for simulation including database access control disclosed by Walther (562) in view of Putz (Database) with the access controls on objects related to internet of things data disclosed by AWS (IoT). One of ordinary skill in the art would have been motivated to make this modification in order to desirably control access to object data on the cloud (AWS (IoT) Section "Authorization").

Examiner General Comments

With regard to the prior art rejection(s), any cited portion of the relied-upon reference(s), either to specific areas or as direct language, is intended to be interpreted in the context of the reference(s) as a whole, as would be understood by one of ordinary skill in the art. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claim, other passages and figures may apply as well. It is respectfully requested that, in preparing responses, the applicant fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the examiner.
The entire reference is considered to provide disclosure relating to the claimed invention. The claims, and only the claims, form the metes and bounds of the invention. Office personnel are to give the claims their broadest reasonable interpretation in light of the supporting disclosure. Unclaimed limitations appearing in the specification are not read into the claim. Prior art was referenced using terminology familiar to one of ordinary skill in the art. Such an approach is broad in concept and can be either explicit or implicit in meaning. Examiner's Notes are provided with the cited references to assist the applicant to better understand how the examiner interprets the applied prior art. Such comments are entirely consistent with the intent and spirit of compact prosecution.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ALFRED H. WECHSELBERGER, whose telephone number is (571) 272-8988. The examiner can normally be reached M – F, 10am to 6pm.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Emerson Puente, can be reached at 571-272-3652. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ALFRED H. WECHSELBERGER/
Examiner, Art Unit 2187

/EMERSON C PUENTE/
Supervisory Patent Examiner, Art Unit 2187

Prosecution Timeline

Dec 10, 2020: Application Filed
Feb 24, 2024: Non-Final Rejection — §103
May 23, 2024: Response Filed
Oct 05, 2024: Final Rejection — §103
Jan 08, 2025: Request for Continued Examination
Jan 13, 2025: Response after Non-Final Action
Mar 07, 2025: Non-Final Rejection — §103
Aug 01, 2025: Response Filed
Nov 14, 2025: Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12561501: SYSTEM AND METHOD FOR EXCESS GAS UTILIZATION (granted Feb 24, 2026; 2y 5m to grant)
Patent 12517804: GENERATING TECHNOLOGY ENVIRONMENTS FOR A SOFTWARE APPLICATION (granted Jan 06, 2026; 2y 5m to grant)
Patent 12468581: INTER-KERNEL DATAFLOW ANALYSIS AND DEADLOCK DETECTION (granted Nov 11, 2025; 2y 5m to grant)
Patent 12462075: RESOURCE PREDICTION SYSTEM FOR EXECUTING MACHINE LEARNING MODELS (granted Nov 04, 2025; 2y 5m to grant)
Patent 12450145: ADVANCED SIMULATION MANAGEMENT TOOL FOR A MEDICAL RECORDS SYSTEM (granted Oct 21, 2025; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 58%
With Interview: 94% (+36.5%)
Median Time to Grant: 3y 8m
PTA Risk: High
Based on 212 resolved cases by this examiner. Grant probability derived from career allow rate.
