DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 1, 3-11, and 13-15 are pending in this Office action.
Claims 2 and 12 are cancelled.
Response to Arguments
Applicant's arguments filed 10/02/2025 have been fully considered but they are not persuasive.
Applicant’s argument:
The following distinguishing recitation of amended claim 1: "wherein during the simulation static and dynamic objects, which are part of the virtual traffic situation, are marked with meta information." is neither disclosed nor suggested by the cited prior art Khorsand et al. and Heit et al. This distinguishing recitation has the technical effect that subsequent data able to be obtained from the simulation data thereby includes the so-called ground truth information. When the scenario data is used to test a driver assistance system, for example, one can follow which objects the driver assistance system correctly detected and which it incorrectly detected. Examples of such labels are "tree", "pedestrian", "passenger", "car", "truck", etc. (p. 14, l. 23 to 28 of the English translation of the application).
….
Khorsand et al. is limited to "labeling" AD/ADAS vehicles 210, human-controlled vehicles 220 and the fully computer-controlled vehicles 230 so that a real player can immediately identify the vehicles and expose them to specific traffic situations. Labeling the vehicles with metadata, which thus goes beyond merely labeling them with a "flag", "identifier" or similar, is not mentioned in Khorsand et al. However, this is essential in order to solve the objective technical problem, namely to obtain simulation data with ground truth information.
Examiner response:
The crux of the argument is that neither Khorsand nor Heit discloses labeling the objects; the applicant's representative, however, merely recites examples of object sets.
Khorsand, in generating a scenario, stores the associated parameters in a data store.
[0306] “… Storing of data that allows for re-creation and thus reuse of the scenario may thus be more valuable than data relating to the behavior of the functionality, e.g. testing of the AD/ADAS functionality as such.”;
The following passages show what Khorsand stores and collects:
[0045] “The computer controlled and/or human controlled movable virtual objects may e.g. correspond to or comprise vehicle(s), as discussed and illustrated in relation to Figure 2A, and may additionally or alternatively comprise or be other object(s), such as virtual human(s), e.g. pedestrians, walking and/or running, with e.g. stroller(s), flying objects, such as drones, or thrown or falling objects, such as stones or trees, that can be moved by the user and placed e.g. on a road in the virtual environment, etc. As understood, the movable virtual objects may correspond to various kind of road users.”;
[0033] “Figure 2a schematically illustrates an example of such virtual environment mentioned above, here a virtual environment 200. As exemplified, the virtual environment 200 contains virtual infrastructure comprising roads for driving of virtual vehicles, e.g. virtual cars, and associated adjoining and nearby structures to the roads, e.g. in the form of buildings, cross-walks, sidewalks, traffic lights, etc. Figure 2b illustrates the virtual environment 200 of Figure 2a, now also populated with computer controlled movable virtual objects 230a-c, that may be exemplified as artificial intelligence (Al) controlled cars, a virtual AD/ADAS vehicle 210 that may be exemplified as an AV car, configured to operate in accordance with said AD/ADAS functionality and human controlled movable virtual objects 220a-c that may be exemplified as a human controlled, i.e. human driver (HD), cars. The objects and vehicles 210, 220a-c, 230a-c are operating and/or configured to operate in the virtual environment 200.”;
Heit likewise discloses the types of parameters that are stored. First, the parameters are stored in storage 104 of Fig. 1:
[0024] “Storage 104 includes scenario parameter(s) 110”;
The stored scenario parameters comprise the following set:
[0025] “It is understood that scenario parameters 110 may describe a wide variety of scenarios, and may include one or more of: a distance between the autonomous vehicle system and an object entering its path (e.g., a pedestrian entering the vehicle's path at a distance of 500 meters, 400 meters, 250 meters, 100 meters, or 50 meters, among other possibilities), a type of object present in the scenario (e.g., a cyclist, another vehicle, a pedestrian, an animal, debris, etc.), roads and transportation infrastructure generally (e.g., a winding mountain road, a straight flat road, a four way stoplight intersection, a stop sign intersection, a roundabout, a road under construction, bridges, tunnels, and the like), weather and other environmental data (e.g., precipitation, ambient temperature, humidity, cloud cover, time of day, ambient luminosity or strength of sunlight, wind speed, etc.), traffic conditions (e.g., heavy traffic, light traffic, substantially no other vehicles, large number of pedestrians such as in a crowded parking lot, emergency vehicles driving atypically to arrive at an emergency, among other types of traffic conditions), and events to which the autonomous vehicle system must respond (e.g., a collision between two or more vehicles in the autonomous vehicle system's path, another vehicle suddenly stopping in front of the autonomous vehicle system, another vehicle failing to obey a stop sign or other traffic law, etc.).”;
Double Patenting
The non-statutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A non-statutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on non-statutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a non-statutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based e-Terminal Disclaimer may be filled out completely online using web-screens. An e-Terminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about e-Terminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 1-15 are provisionally rejected on the ground of non-statutory double patenting as being unpatentable over claims 1-16 of co-pending Application No. 18/548,818 in view of Khorsand et al. (EP 3745382 A1, from IDS). Although the claims at issue are not identical, they are not patentably distinct from each other. The mapping of the independent claims 1 is as follows, where corresponding limitations have the same cue.
This is a provisional non-statutory double patenting rejection because the patentably indistinct claims have not in fact been patented.
Instant Application 18/548,810
Co-pending Application 18/548,818
A computer-implemented method for generating scenario data for the testing of a driver assistance system of a vehicle, the method comprising: generating simulation data by:
simulating a virtual traffic situation having a plurality of virtual road users, wherein at least one virtual road user of the plurality of virtual road users can be controlled by a first user and any virtual road users of the plurality of virtual road users not able to be controlled by other users are automatically controlled;
outputting a virtual environment of the at least one virtual road user on the basis of the virtual traffic situation to the first user via a first user interface;
capturing inputs of the first user for controlling the at least one virtual road user in the virtual environment of the at least one virtual road user,
wherein the captured inputs of the first user and an interaction of the at least one virtual road user with the virtual environment are factored into the simulating of the virtual traffic situation;
checking the generated simulation data for an occurrence of a scenario arising from the interaction of the at least one virtual road user with the virtual environment;
wherein the occurrence of the scenario is characterized by a predefined constellation of simulated measured variables;
extracting scenario data related to the scenario upon occurrence of the scenario being determined;
recording the scenario data for the testing of the driver assistance system, wherein during the simulation static and dynamic objects, which are part of the virtual traffic situation, are marked with meta information.
1. A computer-implemented method for the testing of a driver assistance system of a vehicle having the following work steps:
simulating a virtual traffic situation comprising the vehicle and at least one further road user, wherein a first road user can be controlled by a first user and wherein further road users not able to be controlled by users are automatically controlled, in particular by artificial intelligence or by logic-based control;
outputting a virtual environment of the at least one first road user on the basis of the virtual traffic situation to the first user via a first, in particular an at least visual, user interface;
capturing inputs of the first user for controlling the at least one first road user in the virtual environment of the first road user via a second user interface,
wherein the captured inputs of the first user and the resulting interaction of the at least one first road user with its virtual environment are factored into the simulating of the virtual traffic situation;
The co-pending application does not explicitly disclose:
checking the generated simulation data for an occurrence of a scenario arising from the interaction of the at least one virtual road user with the virtual environment;
wherein the occurrence of the scenario is characterized by a predefined constellation of simulated measured variables;
extracting scenario data related to the scenario upon occurrence of the scenario being determined; and recording the scenario data for the testing of the driver assistance system, wherein during the simulation static and dynamic objects, which are part of the virtual traffic situation, are marked with meta information.
Khorsand discloses:
checking the generated simulation data for an occurrence of a scenario arising from the interaction of the at least one virtual road user with the virtual environment:
[0067] “The server 105 may initiate identification, or identify, that a certain scenario has occurred. The identification may be based on data generated in relation to one or more of said at least one virtual AD/ADAS vehicle, e.g. the virtual AD/ADAS vehicle 210”;
wherein the occurrence of the scenario is characterized by a predefined constellation of simulated measured variables:
[0074] “For example, the identification may be based that the software providing the AD/ADAS functionality, when executed on the server 105 or on the device 101, signals that it could not handle a situation as it should or that a problematic event, e.g. collision, has occurred, and/or that some part of software monitors what the involved AD/ADAS vehicle is being caused and/or is causing when interacting with other vehicles, objects and infrastructure in the virtual environment, and generate data identifying certain events, e.g. a collision. The identification may further be based on a predefined criteria that e.g. may comprise occurrence of such certain event and/or occurrence of certain signal from the software providing the AD/ADAS functionality”;
extracting scenario data related to the scenario upon occurrence of the scenario being determined:
[0088] “Note that the present action, or in general storage of scenario data, such as data enabling recreation of at least part of the scenario, may be performed in response to identification that a scenario of interest has occurred and be regarding the identified scenario of interest, i.e. may be in response to Action 304. It may suffice and be beneficial to only store data for recreation of scenarios of interest.”
recording the scenario data for the testing of the driver assistance system:
Step 306 of Fig. 3 and [0085] “A scenario of interest for testing a particular AD/ADAS functionality is typically of interest for testing this and similar or related AD/ADAS functionality also when provided by another software and/or when related to another vehicle. Storing of data that allows for re-creation and thus reuse of the scenario may thus be more valuable than data relating to the behavior of the functionality, e.g. testing of the AD/ADAS functionality as such.”;
wherein during the simulation static and dynamic objects, which are part of the virtual traffic situation, are marked with meta information:
[0045] “The computer controlled and/or human controlled movable virtual objects may e.g. correspond to or comprise vehicle(s), as discussed and illustrated in relation to Figure 2A, and may additionally or alternatively comprise or be other object(s), such as virtual human(s), e.g. pedestrians, walking and/or running, with e.g. stroller(s), flying objects, such as drones, or thrown or falling objects, such as stones or trees, that can be moved by the user and placed e.g. on a road in the virtual environment, etc. As understood, the movable virtual objects may correspond to various kind of road users.”;
[0033] “Figure 2a schematically illustrates an example of such virtual environment mentioned above, here a virtual environment 200. As exemplified, the virtual environment 200 contains virtual infrastructure comprising roads for driving of virtual vehicles, e.g. virtual cars, and associated adjoining and nearby structures to the roads, e.g. in the form of buildings, cross-walks, sidewalks, traffic lights, etc. Figure 2b illustrates the virtual environment 200 of Figure 2a, now also populated with computer controlled movable virtual objects 230a-c, that may be exemplified as artificial intelligence (Al) controlled cars, a virtual AD/ADAS vehicle 210 that may be exemplified as an AV car, configured to operate in accordance with said AD/ADAS functionality and human controlled movable virtual objects 220a-c that may be exemplified as a human controlled, i.e. human driver (HD), cars. The objects and vehicles 210, 220a-c, 230a-c are operating and/or configured to operate in the virtual environment 200.”;
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of the cited references. One of ordinary skill in the art before the effective filing date of the claimed invention would have been motivated to incorporate the teachings of Khorsand into the teachings of the co-pending application in order to reduce the number of resources required to test an autonomously operating vehicle's reliability by triggering system faults before failures are observed in real-world operation. Vehicle performance can be evaluated, or quality of the autonomous vehicle system can be tested, by determining a performance metric for each scenario (e.g., a quantitative indication of the performance of one or more aspects of the autonomous vehicle, such as the performance of vehicle sensors, the performance of vehicle brakes, etc. in the various scenarios). [Heit 0007].
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claim 13 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. Claim 13 refers back to claim 12, but claim 12 is cancelled.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 3-11, and 13-15 are rejected under 35 U.S.C. 103 as being unpatentable over Khorsand et al. (EP 3745382 A1, from IDS) in view of Heit et al. (US 2019/0155291 A1).
As per claim 1, Khorsand discloses a computer-implemented method for generating scenario data for the testing of a driver assistance system of a vehicle:
[0007] “According to a first aspect of embodiments herein, the object is achieved by a method, performed by a server, for supporting generation of scenarios for testing autonomous driving and/or advanced driver assistance system, AD/ADAS, functionality for one or more real world vehicles.”;
the method comprising: generating simulation data by:
simulating a virtual traffic situation having a plurality of virtual road users, wherein at least one virtual road user of the plurality of virtual road users can be controlled by a first user and any virtual road users of the plurality of virtual road users not able to be controlled by other users are automatically controlled:
[0007] “In the virtual environment it is operating: one or more fully computer controlled movable virtual objects, one or more human controlled movable virtual objects and at least one virtual AD/ADAS vehicle operating according to said AD/ADAS functionality. The server allows devices to remotely connect to the server and users of said devices to, via user interfaces of the devices, control said human controlled movable virtual objects, respectively, in the virtual environment, and thereby cause generation of scenarios that one or more of said at least one virtual AD/ADAS vehicle are subjected to”;
[0033] “Figure 2b illustrates the virtual environment 200 of Figure 2a, now also populated with computer controlled movable virtual objects 230a-c, that may be exemplified as artificial intelligence (Al) controlled cars, a virtual AD/ADAS vehicle 210 that may be exemplified as an AV car, configured to operate in accordance with said AD/ADAS functionality and human controlled movable virtual objects 220a-c that may be exemplified as a human controlled, i.e. human driver (HD), cars. The objects and vehicles 210, 220a-c, 230a-c are operating and/or configured to operate in the virtual environment 200.”;
outputting a virtual environment of the at least one virtual road user on the basis of the virtual traffic situation to the first user via a first user interface:
Fig. 3 step 301 and 302 [0050]“The server 105 allows devices, e.g. the devices 101-103, to remotely connect to the server 105 and users of the devices 101-103 to, via user interfaces of the devices 101-103, control said human controlled movable virtual objects 220a-c, respectively, in the virtual environment 200.”;
capturing inputs of the first user for controlling the at least one virtual road user in the virtual environment of the at least one virtual road user:
[0051]“The scenarios above are thus caused by at least one user controlling a human controlled movable virtual object, e.g. any one of 220a-c, during operation. This cause interaction directly between this or these and the virtual AD/ADAS vehicle 210, or indirectly via the computer controlled movable virtual objects 230a-c, another human controlled movable virtual object, such as controlled by another user, and/or the virtual environment 200”;
[0052]“However, the greater amount of and the greater variation among generated scenarios, the greater the chance that scenarios of interest and corner case scenarios are generated. Multiple users on multiple devices, respectively, where the user of each device is controlling one or more human controlled movable virtual object in the virtual environment 200 together with fully computer controlled movable virtual objects also operating therein, will facilitate and support generation of such scenarios”;
wherein the captured inputs of the first user and an interaction of the at least one virtual road user with the virtual environment are factored into the simulating of the virtual traffic situation:
[0032] “Figure 2a schematically illustrates an example of such virtual environment mentioned above, here a virtual environment 200. As exemplified, the virtual environment 200 contains virtual infrastructure comprising roads for driving of virtual vehicles, e.g. virtual cars, and associated adjoining and nearby structures to the roads, e.g. in the form of buildings, cross-walks, sidewalks, traffic lights, etc.”;
checking the generated simulation data for an occurrence of a scenario arising from the interaction of the at least one virtual road user with the virtual environment:
[0067] “The server 105 may initiate identification, or identify, that a certain scenario has occurred. The identification may be based on data generated in relation to one or more of said at least one virtual AD/ADAS vehicle, e.g. the virtual AD/ADAS vehicle 210”;
wherein the occurrence of the scenario is characterized by a predefined constellation of simulated measured variables:
[0074] “For example, the identification may be based that the software providing the AD/ADAS functionality, when executed on the server 105 or on the device 101, signals that it could not handle a situation as it should or that a problematic event, e.g. collision, has occurred, and/or that some part of software monitors what the involved AD/ADAS vehicle is being caused and/or is causing when interacting with other vehicles, objects and infrastructure in the virtual environment, and generate data identifying certain events, e.g. a collision. The identification may further be based on a predefined criteria that e.g. may comprise occurrence of such certain event and/or occurrence of certain signal from the software providing the AD/ADAS functionality”;
extracting scenario data related to the scenario upon occurrence of the scenario being determined:
[0088] “Note that the present action, or in general storage of scenario data, such as data enabling recreation of at least part of the scenario, may be performed in response to identification that a scenario of interest has occurred and be regarding the identified scenario of interest, i.e. may be in response to Action 304. It may suffice and be beneficial to only store data for recreation of scenarios of interest.”
recording the scenario data for the testing of the driver assistance system:
Step 306 of Fig. 3 and [0085] “A scenario of interest for testing a particular AD/ADAS functionality is typically of interest for testing this and similar or related AD/ADAS functionality also when provided by another software and/or when related to another vehicle. Storing of data that allows for re-creation and thus reuse of the scenario may thus be more valuable than data relating to the behavior of the functionality, e.g. testing of the AD/ADAS functionality as such.”;
wherein during the simulation static and dynamic objects, which are part of the virtual traffic situation, are marked with meta information:
[0045] “The computer controlled and/or human controlled movable virtual objects may e.g. correspond to or comprise vehicle(s), as discussed and illustrated in relation to Figure 2A, and may additionally or alternatively comprise or be other object(s), such as virtual human(s), e.g. pedestrians, walking and/or running, with e.g. stroller(s), flying objects, such as drones, or thrown or falling objects, such as stones or trees, that can be moved by the user and placed e.g. on a road in the virtual environment, etc. As understood, the movable virtual objects may correspond to various kind of road users.”;
[0033] “Figure 2a schematically illustrates an example of such virtual environment mentioned above, here a virtual environment 200. As exemplified, the virtual environment 200 contains virtual infrastructure comprising roads for driving of virtual vehicles, e.g. virtual cars, and associated adjoining and nearby structures to the roads, e.g. in the form of buildings, cross-walks, sidewalks, traffic lights, etc. Figure 2b illustrates the virtual environment 200 of Figure 2a, now also populated with computer controlled movable virtual objects 230a-c, that may be exemplified as artificial intelligence (Al) controlled cars, a virtual AD/ADAS vehicle 210 that may be exemplified as an AV car, configured to operate in accordance with said AD/ADAS functionality and human controlled movable virtual objects 220a-c that may be exemplified as a human controlled, i.e. human driver (HD), cars. The objects and vehicles 210, 220a-c, 230a-c are operating and/or configured to operate in the virtual environment 200.”;
Khorsand, however, does not explicitly disclose:
capturing via a second user interface.
Heit discloses:
Capturing via a second user interface:
[0073] “The vehicle control system 500 can also include a controller 520 capable of controlling one or more aspects of vehicle operation based on automated driving commands received from the processor. In some examples, the vehicle control system 500 can be connected to (e.g., via controller 520) one or more actuator systems 530 in the vehicle and one or more indicator systems 540 in the vehicle.”;
[0074] “an input component configured to receive the one or more scenario parameters from user input. Additionally, or alternatively to one or more of the examples disclosed above, in some examples, determining the one or more scenario parameters includes retrieving one or more scenario parameters from past simulations stored in the memory”;
Heit also discloses:
wherein during the simulation static and dynamic objects, which are part of the virtual traffic situation, are marked with meta information:
[0025] “It is understood that scenario parameters 110 may describe a wide variety of scenarios, and may include one or more of: a distance between the autonomous vehicle system and an object entering its path (e.g., a pedestrian entering the vehicle's path at a distance of 500 meters, 400 meters, 250 meters, 100 meters, or 50 meters, among other possibilities), a type of object present in the scenario (e.g., a cyclist, another vehicle, a pedestrian, an animal, debris, etc.), roads and transportation infrastructure generally (e.g., a winding mountain road, a straight flat road, a four way stoplight intersection, a stop sign intersection, a roundabout, a road under construction, bridges“;
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of the cited references. One of ordinary skill in the art before the effective filing date of the claimed invention would have been motivated to incorporate the teachings of Heit into the teachings of Khorsand in order to reduce the number of resources required to test an autonomously operating vehicle's reliability by triggering system faults before failures are observed in real-world operation. Vehicle performance can be evaluated, or quality of the autonomous vehicle system can be tested, by determining a performance metric for each scenario (e.g., a quantitative indication of the performance of one or more aspects of the autonomous vehicle, such as the performance of vehicle sensors, the performance of vehicle brakes, etc. in the various scenarios). [Heit 0007].
As per claim 3, the rejection of claim 1 is incorporated and furthermore Khorsand discloses:
wherein the scenario data is described during extraction such that it can be used to simulate scenarios:
[0117] “During operation in the virtual environment, e.g. during action 405, the server 105 and/or device(s), e.g. 101, may store data, which may be referred to as data logging. The data may relate to behavior of the virtual AD/ADAS vehicle(s), e.g. the virtual AD/ADAS vehicle 210, and/or to data that enables generated scenario(s) to be partly or fully re-created.”;
As per claim 4, the rejection of claim 1 is incorporated and furthermore Khorsand discloses:
wherein the at least one virtual road user is prompted to perform activity by at least one action in the virtual environment:
[0036] “Users of the devices 101-103 may e.g. via user interfaces of the devices, control the human controlled movable virtual objects 220a-c in the virtual environment 200 and thereby make one or more of these interact with and/or affect the virtual environment 200 as such, e.g. the infrastructure thereof, the computer controlled virtual objects 230a-c and/or the virtual AD/ADAS vehicle 210 and/or other human controlled movable virtual object(s).”;
As per claim 5, the rejection of claim 1 is incorporated and furthermore Khorsand does not explicitly disclose:
determining a quality factor for the extracted scenario data as a function of a predefined criterion, wherein the quality factor is characterized by a level of dangerousness of the scenario.
Heit discloses:
determining a quality factor for the extracted scenario data as a function of a predefined criterion, wherein the quality factor is characterized by a level of dangerousness of the scenario:
[0030] “For example, scenario parameters 110 may be incrementally changed to describe scenarios with distances between the autonomous vehicle system and the stopped traffic that decrease to 250 meters, 200 meters, 100 meters, and so on with each scenario. The performance metrics 112 of the scenarios may indicate corresponding decreases in the quality of the autonomous vehicle's attempt to stop, such as a decrease from over ten meters between the vehicles following the stop in the first scenario to less than 2 meters between the vehicles following the stop in the scenario with a distance of 100 meters.”;
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of the cited references. One of ordinary skill in the art before the effective filing date of the claimed invention would have been motivated to incorporate the teachings of Heit into the teachings of Khorsand to reduce the number of resources required to test an autonomously operating vehicle's reliability by triggering system faults before failures are observed in real-world operation. Vehicle performance can be evaluated, or quality of the autonomous vehicle system can be tested, by determining a performance metric for each scenario (e.g., a quantitative indication of the performance of one or more aspects of the autonomous vehicle, such as the performance of vehicle sensors, the performance of vehicle brakes, etc., in the various scenarios). [Heit 0007].
As per claim 6, the rejection of claim 5 is incorporated and furthermore Heit discloses:
wherein the quality factor is higher the more dangerous the scenario is:
[0032] “In some examples, therefore, minimum performance scenarios 116 may be determined based on scenarios with performance metrics 112 that are larger than those of other scenarios, such as where performance metrics 112 measure the autonomous vehicle system's stopping distance or its delays when responding to its surroundings (e.g., in some examples, a high performance metric can indicate undesirable performance, such as long braking distances, or long response delays).”;
As per claim 7, the rejection of claim 5 is incorporated and furthermore Khorsand discloses:
wherein the first user is credited with a reward as a function of the quality factor of a resultant scenario:
[0082]“Feedback like this to the user support further similar behavior by the user, which behaviour e.g. resulted in an identified corner case scenario. It also enables gamification regarding the scenario generation, where the user in association with the feedback can be rewarded. There may e.g. be a rating and/or points rewarded to users and that relate to the number of identified certain scenarios, e.g. scenarios identified as being of interest, that the user has been involved in generating”;
As per claim 8, the rejection of claim 1 is incorporated and furthermore Khorsand discloses:
wherein a traffic flow model is used to simulate the virtual traffic situation:
[0121] “The user mentioned above may e.g. be or correspond to a crowdsourced user that may act as tester and e.g. through actions 401-404 may have possibility to set up or opt, plan, or at least participate in a traffic situation or traffic scenario that will or may result in a scenario during operation in the virtual environment in action 405. The scenario may thereafter be identified and related data be stored as in actions 406-407.”;
As per claim 9, the rejection of claim 1 is incorporated and furthermore Khorsand discloses:
A computer-implemented method for testing the driver assistance system of the vehicle using the scenario data generated via the method according to claim 1:
[0124] “Note that although embodiments herein may be particularly suitable to produce e.g. corner case scenarios that may be re-created and reused in test cases, they may also be used for testing a particular AD/ADAS functionality as such, in addition to e.g. conventional real world test driving and testing in an all computer controlled and simulated context.”;
But not explicitly:
providing the scenario data characterizing the scenario in which the vehicle is situated and which has a plurality of other virtual road users,
simulating the virtual environment of the vehicle from the provided scenario data; outputting the virtual environment to the driver assistance system via the first user interface;
and operating the driver assistance system in the virtual environment of the first vehicle.
Heit discloses:
providing the scenario data characterizing the scenario in which the vehicle is situated and which has a plurality of other virtual road users:
[0049] “As a single specific example, scenario parameters 110 may describe a scenario in which a cyclist enters the path of the autonomously operating vehicle while it travels at a speed of 45 MPH and with a distance of 100 meters between the cyclist and the autonomous vehicle system.”;
simulating the virtual environment of the vehicle from the provided scenario data:
[0051] “At 308, system 100 simulates a scenario of the autonomous vehicle system, according to scenario parameters 110 and vehicle parameters 114, as described in greater detail above with reference to FIG. 1. Also as described in greater detail above with reference to FIG. 1, one or more processors, servers or simulation systems may perform the simulation.”;
outputting the virtual environment to the driver assistance system via the first user interface; and operating the driver assistance system in the virtual environment of the first vehicle:
[0038] “In some examples, simulator 126 and simulation controller 128 may simulate the various scenarios according to scenario parameters 110 and vehicle parameters 114 as determined by scenario analyzer 120 and vehicle analyzer 124. The simulator 126 may utilize contents of storage 104 (e.g., scenario and vehicle parameters 110, 114, performance metrics 112, minimum performance scenarios 116 and/or instructions 118) to simulate the autonomous vehicle system and its environment (e.g., other vehicles, pedestrians, cyclists, the weather, and other scenario parameters 110 as described above) to behave in a manner that is substantially identical to their real-world counterparts. Certain examples of system 100 may present a simulation performed by simulator 126 in a visual format on display 106”;
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of the cited references. One of ordinary skill in the art before the effective filing date of the claimed invention would have been motivated to incorporate the teachings of Heit into the teachings of Khorsand to reduce the number of resources required to test an autonomously operating vehicle's reliability by triggering system faults before failures are observed in real-world operation. Vehicle performance can be evaluated, or quality of the autonomous vehicle system can be tested, by determining a performance metric for each scenario (e.g., a quantitative indication of the performance of one or more aspects of the autonomous vehicle, such as the performance of vehicle sensors, the performance of vehicle brakes, etc., in the various scenarios). [Heit 0007].
As per claim 10, the rejection of claim 9 is incorporated and furthermore Khorsand discloses:
wherein the driver assistance system is simulated:
[0010] “The server is configured to provide a virtual environment simulating an environment relevant for operation of one or more vehicles having said AD/ADAS functionality.”;
As per claim 11, the rejection of claim 10 is incorporated and furthermore Khorsand discloses:
wherein data relating to the virtual environment of the vehicle is fed into the driver assistance system during the operation of the driver assistance system and/or the driver assistance system:
[0038] “A user of a device, e.g. the device 101, may further load and/or affect behaviour of computer controlled movable virtual objects, e.g. 230a-c, such as Al road users, in the simulator and/or computer game, e.g. in order to set up a scenario where the goal is to expose the virtual AD/ADAS vehicle 210 to a situation or scenario of interest, e.g. corresponding to a corner case scenario.”;
As per claim 13, the rejection of claim 12 is incorporated and furthermore Khorsand discloses:
A non-transitory, computer-readable medium on which a computer program according to claim 12 is stored.
[0125] “The computer program 503 may thus be stored on the computer readable storage medium 601”;
As per claim 14, Khorsand discloses a system comprising:
a processor; and a memory storing therein instructions which cause the processor to generate scenario data for testing a driver assistance system of a vehicle:
[0127]“The server 500 may comprise a processing module 501, such as a means, one or more hardware modules, including e.g. one or more processors, and/or one or more software modules for performing said method and/or actions”;
[0007] “According to a first aspect of embodiments herein, the object is achieved by a method, performed by a server, for supporting generation of scenarios for testing autonomous driving and/or advanced driver assistance system, AD/ADAS, functionality for one or more real world vehicles.”;
simulating a virtual traffic situation having a plurality of virtual road users, wherein at least one virtual road user of the plurality of virtual road users can be controlled by a first user, wherein any virtual road users of the plurality of virtual road users not able to be controlled by other users are automatically controlled:
[0007] “In the virtual environment it is operating: one or more fully computer controlled movable virtual objects, one or more human controlled movable virtual objects and at least one virtual AD/ADAS vehicle operating according to said AD/ADAS functionality. The server allows devices to remotely connect to the server and users of said devices to, via user interfaces of the devices, control said human controlled movable virtual objects, respectively, in the virtual environment, and thereby cause generation of scenarios that one or more of said at least one virtual AD/ADAS vehicle are subjected to”;
[0033] “Figure 2b illustrates the virtual environment 200 of Figure 2a, now also populated with computer controlled movable virtual objects 230a-c, that may be exemplified as artificial intelligence (Al) controlled cars, a virtual AD/ADAS vehicle 210 that may be exemplified as an AV car, configured to operate in accordance with said AD/ADAS functionality and human controlled movable virtual objects 220a-c that may be exemplified as a human controlled, i.e. human driver (HD), cars. The objects and vehicles 210, 220a-c, 230a-c are operating and/or configured to operate in the virtual environment 200.”;
wherein simulation data is generated during the simulation:
[0117] “During operation in the virtual environment, e.g. during action 405, the server 105 and/or device(s), e.g. 101, may store data, which may be referred to as data logging. The data may relate to behavior of the virtual AD/ADAS vehicle(s), e.g. the virtual AD/ADAS vehicle 210, and/or to data that enables generated scenario(s) to be partly or fully re-created.”;
outputting, through a first user interface, a virtual environment of at least one virtual road user to the first user on the basis of the virtual traffic situation:
Fig. 3 step 301 and 302 [0050]“The server 105 allows devices, e.g. the devices 101-103, to remotely connect to the server 105 and users of the devices 101-103 to, via user interfaces of the devices 101-103, control said human controlled movable virtual objects 220a-c, respectively, in the virtual environment 200.”;
capturing inputs of the first user for controlling the at least one virtual road user in the virtual environment of the at least one virtual road user:
[0051]“The scenarios above are thus caused by at least one user controlling a human controlled movable virtual object, e.g. any one of 220a-c, during operation. This cause interaction directly between this or these and the virtual AD/ADAS vehicle 210, or indirectly via the computer controlled movable virtual objects 230a-c, another human controlled movable virtual object, such as controlled by another user, and/or the virtual environment 200”;
[0052]“However, the greater amount of and the greater variation among generated scenarios, the greater the chance that scenarios of interest and corner case scenarios are generated. Multiple users on multiple devices, respectively, where the user of each device is controlling one or more human controlled movable virtual object in the virtual environment 200 together with fully computer controlled movable virtual objects also operating therein, will facilitate and support generation of such scenarios”;
wherein the simulating means is further configured to factor the captured inputs of the first user and an interaction of the at least one virtual road user with the virtual environment into the simulating of the virtual traffic situation:
[0032] “Figure 2a schematically illustrates an example of such virtual environment mentioned above, here a virtual environment 200. As exemplified, the virtual environment 200 contains virtual infrastructure comprising roads for driving of virtual vehicles, e.g. virtual cars, and associated adjoining and nearby structures to the roads, e.g. in the form of buildings, cross-walks, sidewalks, traffic lights, etc.”;
checking the generated simulation data for an occurrence of a scenario arising from the interaction of the at least one virtual road user with the virtual environment:
[0067] “The server 105 may initiate identification, or identify, that a certain scenario has occurred. The identification may be based on data generated in relation to one or more of said at least one virtual AD/ADAS vehicle, e.g. the virtual AD/ADAS vehicle 210”;
wherein the occurrence of the scenario is characterized by a predefined constellation of simulated measured variables:
[0074] “For example, the identification may be based that the software providing the AD/ADAS functionality, when executed on the server 105 or on the device 101, signals that it could not handle a situation as it should or that a problematic event, e.g. collision, has occurred, and/or that some part of software monitors what the involved AD/ADAS vehicle is being caused and/or is causing when interacting with other vehicles, objects and infrastructure in the virtual environment, and generate data identifying certain events, e.g. a collision. The identification may further be based on a predefined criteria that e.g. may comprise occurrence of such certain event and/or occurrence of certain signal from the software providing the AD/ADAS functionality”;
extracting scenario data related to the scenario upon occurrence of the scenario being determined by the means for checking the generated simulation data:
[0088] “Note that the present action, or in general storage of scenario data, such as data enabling recreation of at least part of the scenario, may be performed in response to identification that a scenario of interest has occurred and be regarding the identified scenario of interest, i.e. may be in response to Action 304. It may suffice and be beneficial to only store data for recreation of scenarios of interest.”;
recording, in a data storage, the scenario data for testing the driver assistance system:
Step 306 of fig. 3 and [0085]“ A scenario of interest for testing a particular AD/ADAS functionality is typically of interest for testing this and similar or related AD/ADAS functionality also when provided by another software and/or when related to another vehicle. Storing of data that allows for re-creation and thus reuse of the scenario may thus be more valuable than data relating to the behavior of the functionality, e.g. testing of the AD/ADAS functionality as such.”;
wherein during the simulation static and dynamic objects, which are part of the virtual traffic situation, are marked with meta information:
[0045] “The computer controlled and/or human controlled movable virtual objects may e.g. correspond to or comprise vehicle(s), as discussed and illustrated in relation to Figure 2A, and may additionally or alternatively comprise or be other object(s), such as virtual human(s), e.g. pedestrians, walking and/or running, with e.g. stroller(s), flying objects, such as drones, or thrown or falling objects, such as stones or trees, that can be moved by the user and placed e.g. on a road in the virtual environment, etc. As understood, the movable virtual objects may correspond to various kind of road users.”;
[0033]“ Figure 2a schematically illustrates an example of such virtual environment mentioned above, here a virtual environment 200. As exemplified, the virtual environment 200 contains virtual infrastructure comprising roads for driving of virtual vehicles, e.g. virtual cars, and associated adjoining and nearby structures to the roads, e.g. in the form of buildings, cross-walks, sidewalks, traffic lights, etc. Figure 2b illustrates the virtual environment 200 of Figure 2a, now also populated with computer controlled movable virtual objects 230a-c, that may be exemplified as artificial intelligence (Al) controlled cars, a virtual AD/ADAS vehicle 210 that may be exemplified as an AV car, configured to operate in accordance with said AD/ADAS functionality and human controlled movable virtual objects 220a-c that may be exemplified as a human controlled, i.e. human driver (HD), cars. The objects and vehicles 210, 220a-c, 230a-c are operating and/or configured to operate in the virtual environment 200.”;
But not explicitly:
capturing via a second user interface:
Heit discloses:
capturing via a second user interface:
[0073] “The vehicle control system 500 can also include a controller 520 capable of controlling one or more aspects of vehicle operation based on automated driving commands received from the processor. In some examples, the vehicle control system 500 can be connected to (e.g., via controller 520) one or more actuator systems 530 in the vehicle and one or more indicator systems 540 in the vehicle.”;
[0074] “an input component configured to receive the one or more scenario parameters from user input. Additionally, or alternatively to one or more of the examples disclosed above, in some examples, determining the one or more scenario parameters includes retrieving one or more scenario parameters from past simulations stored in the memory”;
Heit also discloses:
wherein during the simulation static and dynamic objects, which are part of the virtual traffic situation, are marked with meta information:
[0025] “It is understood that scenario parameters 110 may describe a wide variety of scenarios, and may include one or more of: a distance between the autonomous vehicle system and an object entering its path (e.g., a pedestrian entering the vehicle's path at a distance of 500 meters, 400 meters, 250 meters, 100 meters, or 50 meters, among other possibilities), a type of object present in the scenario (e.g., a cyclist, another vehicle, a pedestrian, an animal, debris, etc.), roads and transportation infrastructure generally (e.g., a winding mountain road, a straight flat road, a four way stoplight intersection, a stop sign intersection, a roundabout, a road under construction, bridges”;
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of the cited references. One of ordinary skill in the art before the effective filing date of the claimed invention would have been motivated to incorporate the teachings of Heit into the teachings of Khorsand to reduce the number of resources required to test an autonomously operating vehicle's reliability by triggering system faults before failures are observed in real-world operation. Vehicle performance can be evaluated, or quality of the autonomous vehicle system can be tested, by determining a performance metric for each scenario (e.g., a quantitative indication of the performance of one or more aspects of the autonomous vehicle, such as the performance of vehicle sensors, the performance of vehicle brakes, etc., in the various scenarios). [Heit 0007].
As per claim 15, the rejection of claim 1 is incorporated and furthermore Khorsand discloses:
A system for testing the driver assistance system of the vehicle using the scenario data generated via the method according to claim 1:
[0126] “Hence, the server 500 is for supporting generation of scenarios for testing autonomous driving and/or advanced driver assistance system, AD/ADAS, functionality for one or more real world vehicles”;
But not explicitly:
comprising: a data storage providing the scenario data characterizing the scenario in which the first vehicle is situated and which has a plurality of other virtual road users
simulating the virtual environment of the vehicle based on the scenario data;
outputting the virtual environment to the driver assistance system such that the driver assistance system can be operated in the virtual environment of the vehicle on the basis of the scenario.
Heit discloses:
comprising: a data storage providing the scenario data characterizing the scenario in which the first vehicle is situated and which has a plurality of other virtual road users:
[0049] “As a single specific example, scenario parameters 110 may describe a scenario in which a cyclist enters the path of the autonomously operating vehicle while it travels at a speed of 45 MPH and with a distance of 100 meters between the cyclist and the autonomous vehicle system.”;
simulating the virtual environment of the vehicle based on the scenario data:
[0051] “At 308, system 100 simulates a scenario of the autonomous vehicle system, according to scenario parameters 110 and vehicle parameters 114, as described in greater detail above with reference to FIG. 1. Also as described in greater detail above with reference to FIG. 1, one or more processors, servers or simulation systems may perform the simulation.”;
outputting the virtual environment to the driver assistance system such that the driver assistance system can be operated in the virtual environment of the vehicle on the basis of the scenario.
[0038]“In some examples, simulator 126 and simulation controller 128 may simulate the various scenarios according to scenario parameters 110 and vehicle parameters 114 as determined by scenario analyzer 120 and vehicle analyzer 124. The simulator 126 may utilize contents of storage 104 (e.g., scenario and vehicle parameters 110, 114, performance metrics 112, minimum performance scenarios 116 and/or instructions 118) to simulate the autonomous vehicle system and its environment (e.g., other vehicles, pedestrians, cyclists, the weather, and other scenario parameters 110 as described above) to behave in a manner that is substantially identical to their real-world counterparts. Certain examples of system 100 may present a simulation performed by simulator 126 in a visual format on display 106”;
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of the cited references. One of ordinary skill in the art before the effective filing date of the claimed invention would have been motivated to incorporate the teachings of Heit into the teachings of Khorsand to reduce the number of resources required to test an autonomously operating vehicle's reliability by triggering system faults before failures are observed in real-world operation. Vehicle performance can be evaluated, or quality of the autonomous vehicle system can be tested, by determining a performance metric for each scenario (e.g., a quantitative indication of the performance of one or more aspects of the autonomous vehicle, such as the performance of vehicle sensors, the performance of vehicle brakes, etc., in the various scenarios). [Heit 0007].
Pertinent prior art:
US20190179738A1:
An example method for simulation testing an autonomy software is provided. The example method may include receiving, at processing circuitry, mission parameters indicative of a test mission, environmental parameters, and vehicle parameters.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Contact Information
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BRAHIM BOURZIK whose telephone number is (571)270-7155. The examiner can normally be reached Monday-Friday (8-4:30).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Wei Y Mui can be reached at 571-270-2738. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/BRAHIM BOURZIK/Examiner, Art Unit 2191
/WEI Y MUI/Supervisory Patent Examiner, Art Unit 2191