Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Drawings
The drawings were received on August 27, 2024. These drawings are accepted.
Priority
Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d).
The certified copy was filed on October 13, 2024.
Specification
The specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant’s cooperation is requested in correcting any errors of which applicant may become aware in the specification.
Status of Claims
This Non-Final rejection is in response to applicant’s filing of October 2, 2024.
Claims 1-20 are pending and examined below.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1, 3, 10-13 and 15-16 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA ), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claims 2 and 15 include the limitation “basic traffic graph”; the examiner recognizes that the “basic traffic graph” represents traffic density at different times of day.
Claims 10-13 include the limitation “basic probability”; the examiner recognizes that this represents standard probability.
Further, the term “basic” is indefinite because the specification does not clearly redefine it.
Claims 3 and 16 are rejected under 35 U.S.C. 112(b) as they depend on rejected claims 2 and 15.
For purposes of examination, claims 2, 10-13, and 15 are interpreted according to the examiner’s best understanding.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Analysis for claim 1:
In January 2019 (updated October 2019), the USPTO released new examination guidelines setting forth a two-step inquiry for determining whether a claim is directed to non-statutory subject matter. According to the guidelines, a claim is directed to non-statutory subject matter if:
STEP 1: the claim does not fall within one of the four statutory categories of invention (process, machine, manufacture or composition of matter), or
STEP 2: the claim recites a judicial exception, e.g. an abstract idea, without reciting additional elements that amount to significantly more than the judicial exception, as determined using the following analysis:
STEP 2A (PRONG 1): Does the claim recite an abstract idea, law of nature, or natural phenomenon?
STEP 2A (PRONG 2): Does the claim recite additional elements that integrate the judicial exception into a practical application?
STEP 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception?
Using the two-step inquiry, it is clear that claim 1 is directed toward non-statutory subject matter, as shown below:
STEP 1: Does claim 1 fall within one of the statutory categories? Yes. The claim is directed to a process, which falls within one of the statutory categories.
STEP 2A (PRONG 1): Is the claim directed to a law of nature, a natural phenomenon or an abstract idea? Yes, the claim is directed to an abstract idea.
With regard to STEP 2A (PRONG 1), the guidelines provide three groupings of subject matter that are considered abstract ideas:
Mathematical concepts – mathematical relationships, mathematical formulas or equations, mathematical calculations;
Certain methods of organizing human activity – fundamental economic principles or practices (including hedging, insurance, mitigating risk); commercial or legal interactions (including agreements in the form of contracts; legal obligations; advertising, marketing or sales activities or behaviors; business relations); managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions); and
Mental processes – concepts that are practicably performed in the human mind (including an observation, evaluation, judgment, opinion).
Claim 1
A data processing method, performed by a computer device running a driving simulation system, the method comprising:
determining, in the driving simulation system, whether a simulated ramp connected to a simulated main road belongs to a sensing region with sensed data;
generating a first virtual simulated vehicle in the simulated ramp at a simulation starting moment in response to that the simulated ramp does not belong to the sensing region with the sensed data;
controlling, in a simulation reproduction stage after the simulation starting moment, a driving behavior of at least one second virtual simulated vehicle traveling in the simulated ramp according to an autonomous driving model corresponding to the simulated ramp, to obtain a traffic status of the simulated ramp, the at least one second virtual simulated vehicle traveling in the simulated ramp comprising the first virtual simulated vehicle; and
controlling, in a simulation prediction stage after the simulation reproduction stage, based on the traffic status of the simulated ramp obtained in the simulation reproduction stage, a driving behavior of a third virtual simulated vehicle traveling in the simulated ramp according to the autonomous driving model corresponding to the simulated ramp, to obtain a predicted traffic status of the simulated ramp, the predicted traffic status being configured to control a traveling state of a physical vehicle traveling on a physical road corresponding to the simulated ramp.
The method in claim 1 is a mental process that can be practicably performed in the human mind and is, therefore, an abstract idea. Specifically, the limitation of claim 1 reciting that “the simulated ramp does not belong to the sensing region” merely consists of determining whether the ramp is on the vehicle’s path and determining the status of the traffic, which can be done with pen and paper using collected data. The limitation “controlling…a driving behavior” merely simulates a driving behavior, which can be done in one’s head as a scenario of the action taken while driving (i.e., stopping the vehicle in the middle of the ramp will stop traffic at the entrance of the ramp). These steps merely determine the traffic status and driving behavior based on simulation and data collection, without physical control of the vehicle. More specifically, a person can determine the traffic status and driving behavior in different situations based on historical data (previous travel/experience) for the ramp. Thus, the claims recite a mental process.
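For context, the three-stage flow recited in claim 1 (seed a first virtual vehicle when the ramp is outside the sensing region, reproduce traffic with the driving model, then predict traffic from the reproduced status) can be outlined as follows. This is purely an illustrative sketch; all names, data structures, and the trivial "driving model" are hypothetical assumptions and are not drawn from the claim or the cited art:

```python
from dataclasses import dataclass, field

@dataclass
class Ramp:
    # Hypothetical stand-in for a simulated ramp; in_sensing_region flags
    # whether sensed data covers the ramp.
    in_sensing_region: bool
    vehicles: list = field(default_factory=list)

def simulate(ramp, model, start_time, reproduction_steps, prediction_steps):
    # If the ramp is outside the sensing region, generate a first virtual
    # simulated vehicle at the simulation starting moment.
    if not ramp.in_sensing_region:
        ramp.vehicles.append({"id": "first_virtual", "t": start_time, "pos": 0.0})

    # Reproduction stage: advance every virtual vehicle on the ramp with the
    # driving model to obtain the ramp's traffic status.
    for _ in range(reproduction_steps):
        for v in ramp.vehicles:
            v["pos"] += model(v)  # model returns a per-step advance
    traffic_status = {
        "count": len(ramp.vehicles),
        "mean_pos": sum(v["pos"] for v in ramp.vehicles) / max(len(ramp.vehicles), 1),
    }

    # Prediction stage: starting from the reproduced status, advance a further
    # (third) virtual vehicle with the same model to obtain a predicted traffic
    # status, which downstream logic could use for a physical vehicle.
    third = {"id": "third_virtual", "t": start_time + reproduction_steps,
             "pos": traffic_status["mean_pos"]}
    ramp.vehicles.append(third)
    for _ in range(prediction_steps):
        for v in ramp.vehicles:
            v["pos"] += model(v)
    predicted_status = {
        "count": len(ramp.vehicles),
        "mean_pos": sum(v["pos"] for v in ramp.vehicles) / len(ramp.vehicles),
    }
    return traffic_status, predicted_status
```

The sketch makes explicit that each stage is a deterministic bookkeeping step over collected data, which is the character of the limitations addressed in the analysis above.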
STEP 2A (PRONG 2): Does the claim recite additional elements that integrate the judicial exception into a practical application? No, the claim does not recite additional elements that integrate the judicial exception into a practical application.
With regard to STEP 2A (prong 2), whether the claim recites additional elements that integrate the judicial exception into a practical application, the guidelines provide the following exemplary considerations that are indicative that an additional element (or combination of elements) may have integrated the judicial exception into a practical application:
an additional element reflects an improvement in the functioning of a computer, or an improvement to other technology or technical field;
an additional element that applies or uses a judicial exception to affect a particular treatment or prophylaxis for a disease or medical condition;
an additional element implements a judicial exception with, or uses a judicial exception in conjunction with, a particular machine or manufacture that is integral to the claim;
an additional element effects a transformation or reduction of a particular article to a different state or thing; and
an additional element applies or uses the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception.
While the guidelines further state that the exemplary considerations are not an exhaustive list and that there may be other examples of integrating the exception into a practical application, the guidelines also list examples in which a judicial exception has not been integrated into a practical application:
an additional element merely recites the words “apply it” (or an equivalent) with the judicial exception, or merely includes instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea;
an additional element adds insignificant extra-solution activity to the judicial exception; and
an additional element does no more than generally link the use of a judicial exception to a particular technological environment or field of use.
Claim 1
A data processing method, performed by a computer device running a driving simulation system, the method comprising:
determining, in the driving simulation system, whether a simulated ramp connected to a simulated main road belongs to a sensing region with sensed data;
generating a first virtual simulated vehicle in the simulated ramp at a simulation starting moment in response to that the simulated ramp does not belong to the sensing region with the sensed data;
controlling, in a simulation reproduction stage after the simulation starting moment, a driving behavior of at least one second virtual simulated vehicle traveling in the simulated ramp according to an autonomous driving model corresponding to the simulated ramp, to obtain a traffic status of the simulated ramp, the at least one second virtual simulated vehicle traveling in the simulated ramp comprising the first virtual simulated vehicle; and
controlling, in a simulation prediction stage after the simulation reproduction stage, based on the traffic status of the simulated ramp obtained in the simulation reproduction stage, a driving behavior of a third virtual simulated vehicle traveling in the simulated ramp according to the autonomous driving model corresponding to the simulated ramp, to obtain a predicted traffic status of the simulated ramp, the predicted traffic status being configured to control a traveling state of a physical vehicle traveling on a physical road corresponding to the simulated ramp.
Claim 1 does not recite any of the exemplary considerations that are indicative of an abstract idea having been integrated into a practical application. The additional element of “obtaining…to control” amounts only to a virtual simulation and data collection intended to control a vehicle, without actual control of the physical vehicle. Accordingly, this additional element does not integrate the abstract idea into a practical application because it does not impose any meaningful limits on practicing the abstract idea. As such, the additional elements do not integrate the abstract idea into a practical application, and the claim is directed to the abstract idea.
STEP 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception? No, the claim does not recite additional elements that amount to significantly more than the judicial exception.
With regard to STEP 2B, whether the claims recite additional elements that provide significantly more than the recited judicial exception, the guidelines specify that the pre-guideline procedure is still in effect. Specifically, that examiners should continue to consider whether an additional element or combination of elements:
adds a specific limitation or combination of limitations that are not well-understood, routine, conventional activity in the field, which is indicative that an inventive concept may be present; or
simply appends well-understood, routine, conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception, which is indicative that an inventive concept may not be present.
Claim 1 does not recite any specific limitation or combination of limitations that is not well-understood, routine, conventional (WURC) activity in the field. Further, applicant’s specification does not provide any indication that the acquiring steps are performed using anything other than a conventional computer. MPEP 2106.05(d)(II), and the cases cited therein, including Intellectual Ventures I, LLC v. Symantec Corp., 838 F.3d 1307, 1321 (Fed. Cir. 2016), TLI Communications LLC v. AV Automotive, LLC, 823 F.3d 607, 610 (Fed. Cir. 2016), and Electric Power Group, LLC v. Alstom S.A., 830 F.3d 1350, 1354-55, 119 USPQ2d 1739, 1742 (Fed. Cir. 2016), indicate that mere performance of an action is a well-understood, routine, and conventional function when it is claimed in a merely generic manner (as it is here).
CONCLUSION
Thus, since claim 1 (a) is directed toward an abstract idea, (b) does not recite additional elements that integrate the judicial exception into a practical application, and (c) does not recite additional elements that amount to significantly more than the judicial exception, it is clear that claim 1 is directed toward non-statutory subject matter.
Dependent claims 2-13 further limit the abstract idea without integrating the abstract idea into a practical application or adding significantly more. For example, the limitations of claims 2 and 3 are further limitations that, under their broadest reasonable interpretation, cover performance of the limitation in the mind, using an analysis similar to that of claim 1 above.
With respect to independent claims 14 and 20, please see the rejection above with respect to claim 1, which is commensurate in scope with claims 14 and 20, with claim 1 drawn to a method, claim 14 drawn to a corresponding device, and claim 20 drawn to a non-transitory computer-readable storage medium.
Dependent claims 15-19 further limit the abstract idea without integrating the abstract idea into a practical application or adding significantly more. For example, the limitations of claims 15 and 16 are further limitations that, under their broadest reasonable interpretation, cover performance of the limitation in the mind, using an analysis similar to that of claim 1 above.
As such, claims 1-20 are rejected under 35 U.S.C. 101 as being drawn to an abstract idea without significantly more, and thus are ineligible.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA ) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1, 5-12, 14 and 18-20 are rejected under 35 U.S.C. 103 as being unpatentable over Wang (US 2020/0247412 A1) in view of Tong (CN 113674546 A).
Regarding claim 1 Wang teaches a data processing method performed by a computer device running a driving simulation system; (See Wang paragraph 0055 and 0060; “…the virtualized driver assistance application 120 is computer logic executable to provide an augmented driver assistance experience simplifies the complexities of vehicle interaction... FIG. 3 is a flowchart of an example method 300 for providing virtualized driver assistance to a driver. In some embodiments, the virtualized driver assistance application 120 may perform the method 300 to assist the driver of a merging vehicle to smoothly merge from the second lane to the first lane in the merging zone…”); the method comprising: determining, in the driving simulation system, whether a simulated ramp connected to a simulated main road belongs to a sensing region with sensed data; (See Wang paragraph 0093 and 0094; “FIG. 11A illustrates an example field of view of a driver of a merging vehicle with augmented reality assistance information incorporated into the forward field of view. As depicted in FIG. 11A, the field of view 1100 may be a front field of view reflecting the area of the roadway environment that the driver observes when looking ahead. The field of view 1100 may be reflected on a front display surface 1120 covering a portion of the windshield of the merging vehicle that is within the front field of view. In this example, the merging vehicle may travel in the lane 1122 on the entrance ramp segment while the reference vehicle 1102 corresponding to the merging vehicle may travel in the ramp adjacent lane on the freeway segment. As depicted in FIG. 11A, the virtual assistance information renderer 208 may render a virtual target 1104 illustrating the reference vehicle 1102 on the front display surface 1120, thereby overlaying the virtual target 1104 in the field of view 1100. 
Thus, as the driver of the merging vehicle looks ahead through the windshield, the driver of the merging vehicle can see the virtual target 1104 illustrating the reference vehicle 1102 in the field of view 1100 as if the reference vehicle 1102 is traveling in front of the merging vehicle in the lane 1122 at the simulated position of the reference vehicle 1102. Therefore, the driver of the merging vehicle may adjust the vehicle movement of the merging vehicle according to the simulated position of the reference vehicle 1102 indicated by the virtual target 1104. Referring back to FIG. 5, in block 506, the merging plan processor 204 may optionally determine a position range for positioning the merging vehicle in the second lane based on the simulated position of the reference vehicle in the second lane indicated by the virtual target. In some embodiments, the position range may include a minimum (min) region, a safe region, and a max region… ”); generating a first virtual simulated vehicle in the simulated ramp at a simulation starting moment in response to that the simulated ramp does not belong to the sensing region with the sensed data; (See Wang paragraph 0082-0083; “…, the freeway vehicles F1022 and F1022 are in the ramp adjacent lane 1006 while their reference vehicles M1010 and M1014 are in the freeway entrance lane 1008. As depicted in FIG. 10A, the simulated positions of the reference vehicle M1020 and M1014 in the freeway entrance lane 1008 are illustrated with the virtual representations V1030 and V1034. FIG. 10B illustrates another merging situation in a merging zone 1050 in which the entrance ramp segment 1054 includes multiple lanes that eventually merge into a single lane. 
In this example, the reference vehicle processor 202 may determine the reference vehicle corresponding to the freeway vehicles F1072, F1076, F1078 to be the freeway vehicle F1070 and the merging vehicles M1060, M1066, and determine the reference vehicle for the merging vehicles M1060, M1062, M1064, M1066 to be the freeway vehicles F1072, F1076 and the merging vehicles M1062, M1064 as indicated by the arrows 1061, 1063, 1065, 1067, 1069, 1071, 1073 in FIG. 10B. In this example, the merging vehicle M1066 is in the lane 1059 while its reference vehicle M1064 is in the lane 1058 of the entrance ramp segment 1054. As depicted in FIG. 10B, the simulated position of the reference vehicle M1064 in the lane 1059 is illustrated with the virtual representation V1084. Similarly, the freeway vehicle F1078 is in the lane 1056 of the freeway segment 1052 while its reference vehicle M1066 is in the lane 1059 of the entrance ramp segment 1054. As depicted in FIG. 10B, the simulated position of the reference vehicle M1066 in the lane 1056 is illustrated with the virtual representation V1086..”); controlling, in a simulation reproduction stage after the simulation starting moment, a driving behavior of at least one second virtual simulated vehicle traveling in the simulated ramp according to an autonomous driving model corresponding to the simulated ramp; (See Wang paragraph 0092-0093; “In block 706, the virtual assistance information renderer 208 may render the virtual target at the display position in the field of view of the driver of the merging vehicle. For example, the virtual assistance information renderer 208 may overlay the virtual target having the display size and the appearance features at the display position on the transparent display of the display device that reflects a portion of the field of view. 
As discussed above, the display position and/or the display size of the virtual target may be determined based on the simulated position of the reference vehicle corresponding to the merging vehicle and the roadway attributes of the second lane. Thus, the virtual target may illustrate the reference vehicle in the field of view of the driver of the merging vehicle as if the reference vehicle is traveling in the second lane and observed by the driver of the merging vehicle, while the reference vehicle is in fact physically traveling in the first lane. As the virtual target may indicate the simulated position of the reference vehicle ahead of the merging vehicle in the second lane, the driver of the merging vehicle may adjust the vehicle movement of the merging vehicle (e.g., the vehicle speed, the acceleration/deceleration rate, etc.) according to the simulated position of the reference vehicle indicated by the virtual target even before the driver of the merging vehicle can see the reference vehicle traveling in the first lane. As a result, when the merging vehicle reaches the merging point consecutively subsequent to the reference vehicle, the merging vehicle is ready to smoothly merge with the reference vehicle in the first lane due to the prior vehicle adjustment. FIG. 11A illustrates an example field of view of a driver of a merging vehicle with augmented reality assistance information incorporated into the forward field of view. As depicted in FIG. 11A, the field of view 1100 may be a front field of view reflecting the area of the roadway environment that the driver observes when looking ahead. The field of view 1100 may be reflected on a front display surface 1120 covering a portion of the windshield of the merging vehicle that is within the front field of view. 
In this example, the merging vehicle may travel in the lane 1122 on the entrance ramp segment while the reference vehicle 1102 corresponding to the merging vehicle may travel in the ramp adjacent lane on the freeway segment. As depicted in FIG. 11A, the virtual assistance information renderer 208 may render a virtual target 1104 illustrating the reference vehicle 1102 on the front display surface 1120, thereby overlaying the virtual target 1104 in the field of view 1100. Thus, as the driver of the merging vehicle looks ahead through the windshield, the driver of the merging vehicle can see the virtual target 1104 illustrating the reference vehicle 1102 in the field of view 1100 as if the reference vehicle 1102 is traveling in front of the merging vehicle in the lane 1122 at the simulated position of the reference vehicle 1102. Therefore, the driver of the merging vehicle may adjust the vehicle movement of the merging vehicle according to the simulated position of the reference vehicle 1102 indicated by the virtual target 1104.”).
Wang does not explicitly teach but Tong teaches, to obtain a traffic status of the simulated ramp, the at least one second virtual simulated vehicle traveling in the simulated ramp comprising the first virtual simulated vehicle; (See Tong paragraph 0030; “…The ramp control module is turned on. According to the real-time headway data collected by the coil, a round of calculation is initiated with a time interval Δt of 10s, and the time occupancy rate Rtz of the ramp and the rightmost of the main road are respectively calculated according to the formula (2) in step 38 The time occupancy rates of the two lanes Rtm1 and Rtm2. If the ramp occupancy rate Rtz⟩Rtm1 and Rtz⟩Rtm2 within 10s, the ramp has the right of way and the green light is open for at least 20s and at most 90s. Otherwise, the ramp has no right of way.”);and controlling, in a simulation prediction stage after the simulation reproduction stage, based on the traffic status of the simulated ramp obtained in the simulation reproduction stage; (See Tong paragraph 0039-0041; “The method for highway management and control in a V2X environment based on sumo simulation in this embodiment includes the following steps:
Step 1: Import the existing real road network through the well-known open source map database OpenStreetMap or build the road network you want in Sumo's netedit software;
Step 2: On the basis of Step 1, build a V2X vehicle simulation environment and manage and control the vehicle auxiliary control at the micro level.”); a driving behavior of a third virtual simulated vehicle traveling in the simulated ramp according to the autonomous driving model corresponding to the simulated ramp, to obtain a predicted traffic status of the simulated ramp; (See Tong paragraph 0028-0030; “Step 38. Turn on the variable lane module and calculate the time occupancy rate Rt of the road:
Among them, tT is the total observation time, Rt is the occupancy rate of the road, if Rt⟩70%, the emergency lane will be opened at the next time step, otherwise it will not be opened;
Step 39. The ramp control module is turned on. According to the real-time headway data collected by the coil, a round of calculation is initiated with a time interval Δt of 10s, and the time occupancy rate Rtz of the ramp and the rightmost of the main road are respectively calculated according to the formula (2) in step 38 The time occupancy rates of the two lanes Rtm1 and Rtm2. If the ramp occupancy rate Rtz⟩Rtm1 and Rtz⟩Rtm2 within 10s, the ramp has the right of way and the green light is open for at least 20s and at most 90s. Otherwise, the ramp has no right of way.”; Also See Tong paragraph 0048-0050; “Step 2.3: Simulation and micro-control of V2X vehicles
For the V2X intelligent networked vehicles that exist in the road network, the following definition and control of their vehicle behavior are carried out:
1)The basic operating rules of V2X intelligent networked vehicles are as follows: drive smoothly according to the scheduled driving route, observe high-speed traffic regulations, maintain a certain safe following distance with the vehicle in front, and fully consider the time node when changing lanes to ensure safety.”); the predicted traffic status being configured to control a traveling state of a physical vehicle traveling on a physical road corresponding to the simulated ramp; (See Tong paragraph 0048-0050; “Step 2.3: Simulation and micro-control of V2X vehicles
For the V2X intelligent networked vehicles that exist in the road network, the following definition and control of their vehicle behavior are carried out:
1)The basic operating rules of V2X intelligent networked vehicles are as follows: drive smoothly according to the scheduled driving route, observe high-speed traffic regulations, maintain a certain safe following distance with the vehicle in front, and fully consider the time node when changing lanes to ensure safety.”).
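For context, the occupancy-based ramp-metering rule quoted from Tong (paragraphs 0028-0030) can be outlined as follows. This is an illustrative sketch only; the function names and the simple clamping are assumptions for illustration and are not drawn verbatim from Tong:

```python
def time_occupancy(occupied_s, total_s):
    # Time occupancy rate Rt: fraction of the observation interval during
    # which a detector (e.g., Tong's inductive coil) senses a vehicle.
    return occupied_s / total_s

def open_emergency_lane(rt):
    # Per the quoted rule, the variable (emergency) lane is opened at the
    # next time step when road occupancy Rt exceeds 70%.
    return rt > 0.70

def ramp_has_right_of_way(rtz, rtm1, rtm2):
    # Within each 10 s interval, the ramp gets the right of way only when
    # its occupancy Rtz exceeds that of both rightmost main-road lanes.
    return rtz > rtm1 and rtz > rtm2

def green_duration(requested_s):
    # When the ramp has the right of way, the green light is open for at
    # least 20 s and at most 90 s.
    return min(max(requested_s, 20), 90)
```

The sketch reflects the examiner's understanding of the quoted rule: occupancy comparisons over a 10 s interval gate the ramp's right of way, with the green phase clamped to the 20-90 s range.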
Both Wang and Tong are in the same field of data processing methods and apparatus for road simulation. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the present invention, to modify Wang's computer device running a driving simulation system with Tong's predicted traffic status of the simulated ramp. No new functionality would arise from the combination, and the combination would improve the usability of Wang by adding the predicted traffic status of the simulated ramp, allowing better prediction of simulated data through prediction of traffic in the simulation; one of ordinary skill in the art would have recognized that the results of the combination were predictable.
Regarding claim 5 Wang in view of Tong teaches the method according to claim 4, Wang also teaches, further comprising: generating, in response to that a sensing coverage region exists in a downstream region of the simulated on-ramp, a first vehicle removal line perpendicular to a traveling direction of the simulated on-ramp at an upstream edge of the sensing coverage region, the downstream region of the simulated on-ramp belonging to the simulated main road, and the sensing coverage region belonging to the sensing region with the sensed data; and removing, from the driving simulation system, a virtual simulated vehicle that is one of the at least one second virtual simulated vehicle and that travels to the first vehicle removal line; (See Wang paragraph 0056-0057; “An example merging situation in a merging zone 1000 is illustrated in FIG. 10A. As depicted, the merging zone 1000 may include multiple road segments, the road segments of the merging zone 1000 may include a freeway segment 1002 and an entrance ramp segment 1004 of a freeway. In some embodiments, the freeway segment 1002 may include one or more freeway lanes, the one or more freeway lanes may include a ramp adjacent lane 1006 directly adjacent to the entrance ramp segment 1004. For example, the ramp adjacent lane 1006 may be the right-most lane or the left-most lane among the one or more freeway lanes of the freeway segment 1002. In some embodiments, the entrance ramp segment 1004 may include a freeway entrance lane 1008 from which the merging vehicles may shift to the ramp adjacent lane 1006 to enter the freeway segment 1002. In this example, the entrance ramp segment 1004 may include one lane, although it should be understood that other scenarios are applicable, such as but not limited to those in which the entrance ramp segment includes multiple lanes. As depicted in FIG. 
10A, the ramp adjacent lane 1006 of the freeway segment 1002 may merge with the freeway entrance lane 1008 of the entrance ramp segment 1004 at the merging point 1005. In this example, the freeway entrance lane 1008 may be considered the first lane and the freeway entrance lane 1008 may be considered the second lane. It should be understood that the first lane and the second lane merging with one another may be located on different road segments of the merging zone (e.g., the freeway segment 1002 and the entrance ramp segment 1004) or on the same road segment of the merging zone (e.g., the entrance ramp segment 1004).
As depicted in FIG. 10A, the freeway segment 1002 and the entrance ramp segment 1004 may be provided with one or more roadside units 107 located along their roadway to transmit data to and/or receive data from the vehicle platforms 103 traveling on these road segments. In some embodiments, these vehicle platforms 103 may transmit data to and/or receive data from the roadside units 107 as they reach the communication start point 1003 in their corresponding lane. In some embodiments, the vehicle platforms 103 traveling on the road segments of the merging zone may include one or more merging vehicles traveling on the entrance ramp segment and one or more freeway vehicles traveling on the freeway segment. As an example, in the merging situation depicted in FIG. 10A, the merging vehicles may include the merging vehicles M1010, M1012, M1014 traveling on the entrance ramp segment 1004, the freeway vehicles may include the freeway vehicles F1020, F1022 traveling in the ramp adjacent lane 1006 and other freeway vehicles traveling in other lanes of the freeway segment 1002.”).
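For illustration only, the removal-line logic recited in claim 5 (a removal line placed at the upstream edge of the sensing coverage region, with simulated vehicles removed once they travel to it) can be sketched as follows. All names and data structures here are hypothetical and are not drawn from Wang, Tong, or the claims:

```python
def update_removal(vehicles, coverage_upstream_edge):
    """Sketch of claim 5's recited removal step (hypothetical names).

    vehicles: list of dicts, each with a longitudinal 'position' measured
    along the traveling direction of the simulated on-ramp.
    coverage_upstream_edge: position of the upstream edge of the sensing
    coverage region, where the first vehicle removal line is generated.
    """
    removal_line = coverage_upstream_edge  # line perpendicular to travel direction
    # vehicles that have traveled to (or past) the removal line are removed
    kept = [v for v in vehicles if v["position"] < removal_line]
    removed = [v for v in vehicles if v["position"] >= removal_line]
    return kept, removed
```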
Regarding claim 6, Wang in view of Tong teaches the method according to claim 5. Wang further teaches, further comprising: determining, as a first vehicle in the simulated on-ramp, a virtual simulated vehicle that is one of the at least one second virtual simulated vehicle and that is closest to a downstream edge of the simulated on-ramp; and determining a maximum vehicle speed of the first vehicle according to historical data corresponding to the simulated on-ramp; and determining a vehicle other than the first vehicle in the at least one second virtual simulated vehicle as an upstream vehicle in the simulated on-ramp; (See Wang paragraph 0060; “FIG. 9 illustrates a conflict area 902 of a merging zone 900. The merging zone 900 may be a portion of a traffic interchange (e.g., cloverleaf interchange) and may include an entrance ramp segment 910 located relatively close to an exit ramp segment 912. As depicted in FIG. 9, the freeway vehicle 103a may travel in the ramp adjacent lane 920, the merging vehicle 103c may plan on shifting from the entrance ramp segment 910 to the ramp adjacent lane 920 to enter the freeway, and the freeway vehicle 103d may plan on shifting from the freeway lane 922 to the ramp adjacent lane 920 to take the upcoming exit ramp segment 912 to exit the freeway. Thus, there is a potential collision in the conflict area 902 as the freeway vehicle 103a, the merging vehicle 103c, the freeway vehicle 103d may target the same opening space 930 in the ramp adjacent lane 920. As depicted in FIG. 9, the freeway vehicle 103e may plan on shifting from the freeway lane 922 to the ramp adjacent lane 920 to take the exit ramp segment 912 to exit the freeway, and the freeway vehicle 103b may plan on shifting from the ramp adjacent lane 920 to the freeway lane 922 to continue traveling on the freeway with higher speed.
Thus, there is another potential collision in the conflict area 902 as the freeway vehicle 103e and the freeway vehicle 103b may cross paths with one another. As another example of the conflict area, in the merging situation depicted in FIG. 10A, the merging zone 1000 may include the conflict area 1001.”); determining a maximum vehicle speed of the upstream vehicle according to a road type corresponding to the simulated on-ramp; and controlling the driving behavior of the at least one second virtual simulated vehicle traveling in the simulated on-ramp according to an autonomous driving model corresponding to the simulated on-ramp, comprising: controlling the driving behavior of the at least one second virtual simulated vehicle traveling in the simulated on-ramp according to the autonomous driving model corresponding to the simulated on-ramp, the maximum vehicle speed of the upstream vehicle, and the maximum vehicle speed of the first vehicle; (See Wang paragraph 0059; “FIG. 9 illustrates a conflict area 902 of a merging zone 900. The merging zone 900 may be a portion of a traffic interchange (e.g., cloverleaf interchange) and may include an entrance ramp segment 910 located relatively close to an exit ramp segment 912. As depicted in FIG. 9, the freeway vehicle 103a may travel in the ramp adjacent lane 920, the merging vehicle 103c may plan on shifting from the entrance ramp segment 910 to the ramp adjacent lane 920 to enter the freeway, and the freeway vehicle 103d may plan on shifting from the freeway lane 922 to the ramp adjacent lane 920 to take the upcoming exit ramp segment 912 to exit the freeway. Thus, there is a potential collision in the conflict area 902 as the freeway vehicle 103a, the merging vehicle 103c, the freeway vehicle 103d may target the same opening space 930 in the ramp adjacent lane 920. As depicted in FIG. 
9, the freeway vehicle 103e may plan on shifting from the freeway lane 922 to the ramp adjacent lane 920 to take the exit ramp segment 912 to exit the freeway, and the freeway vehicle 103b may plan on shifting from the ramp adjacent lane 920 to the freeway lane 922 to continue traveling on the freeway with higher speed. Thus, there is another potential collision in the conflict area 902 as the freeway vehicle 103e and the freeway vehicle 103b may cross paths with one another. As another example of the conflict area, in the merging situation depicted in FIG. 10A, the merging zone 1000 may include the conflict area 1001.”).
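For illustration only, the speed-cap distinction recited in claim 6 (the first vehicle capped according to historical data, upstream vehicles capped according to the road type) can be sketched as follows. The function and parameter names are hypothetical and are not taken from Wang, Tong, or the claims:

```python
def max_speed_caps(ramp_vehicles_downstream_first, historical_max_speed,
                   road_type_max_speed):
    """Sketch of claim 6's recited maximum-speed determination.

    ramp_vehicles_downstream_first: vehicle IDs ordered so that the vehicle
    closest to the downstream edge of the simulated on-ramp (the "first
    vehicle") comes first; the rest are upstream vehicles.
    """
    caps = {}
    for index, vehicle_id in enumerate(ramp_vehicles_downstream_first):
        # first vehicle: cap from historical data for this on-ramp;
        # upstream vehicles: cap from the road type of the on-ramp
        caps[vehicle_id] = (historical_max_speed if index == 0
                            else road_type_max_speed)
    return caps
```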
Regarding claim 7, Wang in view of Tong teaches the method according to claim 1. Wang further teaches, wherein the simulated ramp includes a simulated off-ramp, the at least one second virtual simulated vehicle traveling in the simulated off-ramp further comprises: a sixth virtual simulated vehicle that is in a fifth virtual simulated vehicle traveling in the simulated main road and that travels from the simulated main road to the simulated off-ramp; (See Wang paragraph 0058-0059; “…the merging zone may include a conflict area. As discussed elsewhere herein, the conflict area may be the area of the merging zone in which the vehicle platforms 103 in different lanes likely perform one or more lane changes to shift to a particular lane (e.g., the ramp adjacent lane 1006) to enter or prepare to exit the freeway. Thus, these vehicle platforms 103 could target the same opening space in that particular lane or likely cross paths with other vehicle platforms 103 due to the lane changes. As a result, the likelihood of collision between the vehicle platforms 103 in the conflict area is usually higher than other areas of the merging zone…FIG. 9 illustrates a conflict area 902 of a merging zone 900. The merging zone 900 may be a portion of a traffic interchange (e.g., cloverleaf interchange) and may include an entrance ramp segment 910 located relatively close to an exit ramp segment 912. As depicted in FIG. 9, the freeway vehicle 103a may travel in the ramp adjacent lane 920, the merging vehicle 103c may plan on shifting from the entrance ramp segment 910 to the ramp adjacent lane 920 to enter the freeway, and the freeway vehicle 103d may plan on shifting from the freeway lane 922 to the ramp adjacent lane 920 to take the upcoming exit ramp segment 912 to exit the freeway.
Thus, there is a potential collision in the conflict area 902 as the freeway vehicle 103a, the merging vehicle 103c, the freeway vehicle 103d may target the same opening space 930 in the ramp adjacent lane 920. As depicted in FIG. 9, the freeway vehicle 103e may plan on shifting from the freeway lane 922 to the ramp adjacent lane 920 to take the exit ramp segment 912 to exit the freeway, and the freeway vehicle 103b may plan on shifting from the ramp adjacent lane 920 to the freeway lane 922 to continue traveling on the freeway with higher speed. Thus, there is another potential collision in the conflict area 902 as the freeway vehicle 103e and the freeway vehicle 103b may cross paths with one another. As another example of the conflict area, in the merging situation depicted in FIG. 10A, the merging zone 1000 may include the conflict area 1001.”); an upstream region of the simulated off-ramp is a sensing blank region, and the upstream region belongs to the simulated main road, the sensing blank region being connected to a demerging point of the simulated off-ramp; and the method further comprises removing, from the driving simulation system, a virtual simulated vehicle that is one of the at least one second virtual simulated vehicle and that travels to a downstream edge of the simulated off-ramp; (See Wang paragraph 0094 and 0096; “FIG. 5, in block 506, the merging plan processor 204 may optionally determine a position range for positioning the merging vehicle in the second lane based on the simulated position of the reference vehicle in the second lane indicated by the virtual target. In some embodiments, the position range may include a minimum (min) region, a safe region, and a max region. Each region may be a portion of the second lane located behind the simulated position of the reference vehicle. 
In some embodiments, the min region may specify a portion of the second lane that satisfies a minimal threshold distance to the simulated position of the reference vehicle (e.g., [40 m, 60 m]). The merging vehicle should not be ahead of the min region to avoid a potential collision with the reference vehicle when the merging vehicle merges with the reference vehicle in the first lane. In some embodiments, the safe region may specify a portion of the second lane that satisfies a safe threshold distance to the simulated position of the reference vehicle (e.g., [60 m, 115 m]). The merging vehicle should maintain its position within the safe region to smoothly merge with the reference vehicle in the first lane as the merging vehicle reaches the merging point. In some embodiments, the max region may specify a portion of the second lane that satisfies a maximal threshold distance to the simulated position of the reference vehicle (e.g., [115 m, 185 m]). The merging vehicle should not be behind the max region so that other vehicles may not arrive at the merging point between the merging vehicle and the reference vehicle, thereby avoiding the need to abruptly change the merging plan of the merging vehicle to adapt accordingly.
In block 508, the virtual assistance information renderer 208 may optionally overlay a virtual position indicator indicating the position range for the merging vehicle in the field of view of the driver of the merging vehicle. For example, as depicted in FIG. 11A, the virtual assistance information renderer 208 may render the virtual position indicator 1140 in the field of view 1100 of the driver of the merging vehicle. As shown, the virtual position indicator 1140 may be rendered relative to the virtual target 1104 on the front display surface 1120 and may indicate the min region, the safe region, the max region of the position range that are located behind the simulated position of the reference vehicle indicated by the virtual target 1104. In some embodiments, the virtual assistance information renderer 208 may also render a merging instruction 1142 in the field of view 1100 instructing the driver of the merging vehicle to follow the virtual target 1104 to smoothly perform the merging process. To follow the virtual target 1104, the driver of the merging vehicle may position the merging vehicle in the lane 1122 according to the regions indicated by the virtual position indicator 1140, thereby maintaining an appropriate following distance to the simulated position of the reference vehicle 1102 indicated by the virtual target 1104. As the merging vehicle maintains an appropriate following distance to the simulated position of the reference vehicle, the merging vehicle can smoothly merge with the reference vehicle as the merging vehicle reaches the merging point.”).
Regarding claim 8, Wang in view of Tong teaches the method according to claim 7. Wang further teaches, further comprising: determining, according to a distance between the fifth virtual simulated vehicle and the demerging point of the simulated off-ramp, the sixth virtual simulated vehicle that is in the fifth virtual simulated vehicle and that enters the simulated off-ramp; (See Wang paragraph 0083-0084; “FIG. 10B illustrates another merging situation in a merging zone 1050 in which the entrance ramp segment 1054 includes multiple lanes that eventually merge into a single lane. In this example, the reference vehicle processor 202 may determine the reference vehicle corresponding to the freeway vehicles F1072, F1076, F1078 to be the freeway vehicle F1070 and the merging vehicles M1060, M1066, and determine the reference vehicle for the merging vehicles M1060, M1062, M1064, M1066 to be the freeway vehicles F1072, F1076 and the merging vehicles M1062, M1064 as indicated by the arrows 1061, 1063, 1065, 1067, 1069, 1071, 1073 in FIG. 10B. In this example, the merging vehicle M1066 is in the lane 1059 while its reference vehicle M1064 is in the lane 1058 of the entrance ramp segment 1054. As depicted in FIG. 10B, the simulated position of the reference vehicle M1064 in the lane 1059 is illustrated with the virtual representation V1084. Similarly, the freeway vehicle F1078 is in the lane 1056 of the freeway segment 1052 while its reference vehicle M1066 is in the lane 1059 of the entrance ramp segment 1054. As depicted in FIG. 10B, the simulated position of the reference vehicle M1066 in the lane 1056 is illustrated with the virtual representation V1086.
Referring back to FIG. 5, once the merging plan processor 204 determines the simulated position of the reference vehicle corresponding to the merging vehicle, the virtual assistance information renderer 208 may render merging assistance information to the driver of the merging vehicle based on the simulated position of the reference vehicle in the second lane. In some embodiments, the merging assistance information may facilitate the driver of the merging vehicle in performing the merging process of the merging vehicle. The merging assistance information may include one or more of a virtual target, a safety information indicator, a virtual position indicator, etc., rendered in the field of view of the driver of the merging vehicle in a perceptive and practical manner. Other types of merging assistance information are also possible and contemplated.”).
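For illustration only, the distance-based selection recited in claim 8 (identifying, among the fifth virtual simulated vehicles, the sixth virtual simulated vehicle that enters the simulated off-ramp according to its distance from the demerging point) can be sketched as follows. The threshold and all names are hypothetical assumptions, not teachings of Wang or Tong:

```python
def select_sixth_vehicles(fifth_vehicles, demerging_point, entry_distance):
    """Sketch of claim 8's recited determination (hypothetical names).

    A fifth virtual simulated vehicle is treated here as entering the
    simulated off-ramp (i.e., as a sixth virtual simulated vehicle) when
    its distance upstream of the demerging point is within entry_distance.
    """
    return [v for v in fifth_vehicles
            if 0 <= demerging_point - v["position"] <= entry_distance]
```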
Regarding claim 9, Wang in view of Tong teaches the method according to claim 8. Wang does not explicitly teach, but Tong teaches, wherein first starting target information of the fifth virtual simulated vehicle is obtained in response to that the fifth virtual simulated vehicle enters a road segment range at a distance from the demerging point less than a first preset distance in the simulation reproduction stage; and first current lane information of the fifth virtual simulated vehicle is obtained in response to that the first starting target information is the simulated off-ramp; and in response to that the first current lane information matches the first starting target information, the fifth virtual simulated vehicle traveling to the simulated off-ramp according to the first current lane information is determined as the sixth virtual simulated vehicle; (See Tong paragraph 0084-0089; “According to the real-time headway data collected by the coil, a round of calculation is initiated with a time interval Δt of 10s.
Step2: Calculate the time occupancy Rtz, Rtm1 and Rtm2 of the ramp and the two rightmost lanes of the main road respectively according to the formula (2) in the third module. If the occupancy rate of the ramp within 10s is Rtz > Rtm1 and Rtz > Rtm2, the ramp has the right of way and the green light is open for at least 20s and at most 90s. Otherwise, the ramp has no right of way.
Step3: Adjust the ramp signal light to complete the control according to the instructions of Step2.
According to the existing bayonet data and other information, the test was carried out in the actual road scene described in Figure 2, and the reinforcement learning algorithm was used. The final comparison results of the scheme are shown in Table 4 below:
Table 4 Evaluation table of results of management and control plan
According to the above results, the best variable speed limit value obtained by the patented highway management and control method in the above highway scene is 110km/h, and large vehicles are allowed to drive in the two lanes on the right, and the emergency lane is opened. The time required for simulation calculation is about 40s. Compared with a control system with a time step of 30min, it has almost reached the level of real-time control combined with a variety of control methods.”).
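For illustration only, the occupancy comparison in Tong's Step 2 above can be sketched as follows. The function names are hypothetical; the 20 s/90 s green-light window and the Rtz > Rtm1, Rtz > Rtm2 condition are taken directly from the quoted passage:

```python
def ramp_has_right_of_way(r_tz, r_tm1, r_tm2):
    # Tong's Step 2: the ramp receives the right of way only when its time
    # occupancy within the 10 s interval exceeds that of both rightmost
    # main-road lanes.
    return r_tz > r_tm1 and r_tz > r_tm2

def green_duration(r_tz, r_tm1, r_tm2, requested_seconds):
    # Clamp a requested green time into Tong's [20 s, 90 s] window when the
    # ramp has the right of way; otherwise grant no green time.
    if not ramp_has_right_of_way(r_tz, r_tm1, r_tm2):
        return 0
    return max(20, min(90, requested_seconds))
```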
Regarding claim 10, Wang in view of Tong teaches the method according to claim 9. Wang further teaches, further comprising: determining first starting target information of the fifth virtual simulated vehicle when the fifth virtual simulated vehicle enters a road segment range at a distance from the demerging point greater than a first preset distance and less than a second preset distance, the determining first starting target information comprising: (See Wang paragraph 0100; “In some embodiments, for a first proximate vehicle among the proximate vehicles, the safety factor processor 206 may determine the safety factors associated with the first proximate vehicle, the safety factors may describe the risk of a traffic accident between the merging vehicle and the first proximate vehicle. In some embodiments, the safety factor processor 206 may analyze the vehicle movement data of the first proximate vehicle determined from the sensor data as discussed above, and determine the vehicle speed of the first proximate vehicle (e.g., 23 m/s), the distance between the longitudinal position of the merging vehicle and the longitudinal position of the first proximate vehicle (e.g., 425 m), the collision metric indicating the estimated collision probability between the merging vehicle and the first proximate vehicle, the lane change probability that the first proximate vehicle may shift from its current lane to the lane of the merging vehicle, etc. In some embodiments, to determine the collision metric, the safety factor processor 206 may determine the travel time Δt_longitudinal for the merging vehicle to travel the distance ahead between the longitudinal position of the merging vehicle and the longitudinal position of the first proximate vehicle using the longitudinal speed of the merging vehicle (e.g., 3 s), and determine the collision metric corresponding to the travel time Δt_longitudinal (e.g., 12%).
In some embodiments, if the current lane of the first proximate vehicle is directly adjacent to the lane of the merging vehicle, the safety factor processor 206 may determine the travel time Δt_lateral for the first proximate vehicle to travel the distance between its lateral position and the lane boundary between these two lanes (e.g., 1 s), and determine the lane change probability corresponding to the travel time Δt_lateral (e.g., 60%). Other safety factors associated with the first proximate vehicle are also possible and contemplated.”); determining a first basic probability of the fifth virtual simulated vehicle for the simulated off-ramp; and generating a first random probability for the fifth virtual simulated vehicle; and determining the first starting target information according to the first basic probability and the first random probability; (See Wang paragraph 0059-0060; “As an example of the conflict area, FIG. 9 illustrates a conflict area 902 of a merging zone 900. The merging zone 900 may be a portion of a traffic interchange (e.g., cloverleaf interchange) and may include an entrance ramp segment 910 located relatively close to an exit ramp segment 912. As depicted in FIG. 9, the freeway vehicle 103a may travel in the ramp adjacent lane 920, the merging vehicle 103c may plan on shifting from the entrance ramp segment 910 to the ramp adjacent lane 920 to enter the freeway, and the freeway vehicle 103d may plan on shifting from the freeway lane 922 to the ramp adjacent lane 920 to take the upcoming exit ramp segment 912 to exit the freeway. Thus, there is a potential collision in the conflict area 902 as the freeway vehicle 103a, the merging vehicle 103c, the freeway vehicle 103d may target the same opening space 930 in the ramp adjacent lane 920. As depicted in FIG.
9, the freeway vehicle 103e may plan on shifting from the freeway lane 922 to the ramp adjacent lane 920 to take the exit ramp segment 912 to exit the freeway, and the freeway vehicle 103b may plan on shifting from the ramp adjacent lane 920 to the freeway lane 922 to continue traveling on the freeway with higher speed. Thus, there is another potential collision in the conflict area 902 as the freeway vehicle 103e and the freeway vehicle 103b may cross paths with one another. As another example of the conflict area, in the merging situation depicted in FIG. 10A, the merging zone 1000 may include the conflict area 1001.
By way of illustration, FIG. 3 is a flowchart of an example method 300 for providing virtualized driver assistance to a driver. In some embodiments, the virtualized driver assistance application 120 may perform the method 300 to assist the driver of a merging vehicle to smoothly merge from the second lane to the first lane in the merging zone. As discussed elsewhere herein, the first lane may merge with the second lane at the merging point. In block 302, the reference vehicle processor 202 may determine a reference vehicle traveling in the first lane, the merging vehicle in the second lane may rely on the reference vehicle in the first lane to smoothly merge from the second lane to the first lane as the merging vehicle reaches the merging point. As an example, in the merging situation depicted in FIG. 10A, the merging vehicle M1012 may travel in the freeway entrance lane 1008 of the entrance ramp segment 1004. In this example, the reference vehicle processor 202 may determine a reference vehicle traveling in the ramp adjacent lane 1006 of the freeway segment 1002 to which the merging vehicle M1012 may reference in order to smoothly merge from the freeway entrance lane 1008 to the ramp adjacent lane 1006.”).
Regarding claim 11, Wang in view of Tong teaches the method according to claim 10. Wang further teaches, wherein the generating a first random probability for the fifth virtual simulated vehicle comprises: determining a first target selection location corresponding to the fifth virtual simulated vehicle according to an aggressive parameter corresponding to the fifth virtual simulated vehicle, the first preset distance, and the second preset distance (See Wang paragraph 0078-0079; “In block 658, the merging plan processor 204 may adjust the simulated position of the reference vehicle in the second lane based on the driving behavior data associated with the driver of the reference vehicle and the sensor data of the merging vehicle. In some embodiments, the merging plan processor 204 may analyze the driving behavior data associated with the driver of the reference vehicle, and assign an aggressive metric indicating the level of driving aggressiveness to the driver. As an example, the driving behavior data of the driver may indicate that the driver usually drives above the speed limit with an abrupt braking pattern and a short following distance (e.g., less than 2 s). Thus, the merging plan processor 204 may determine that the driver of the reference vehicle often drives aggressively and assign the aggressive metric of 0.87 to the driver. In some embodiments, the merging plan processor 204 may adjust the simulated position of the reference vehicle in the second lane based on the aggressive metric of its driver. For example, the merging plan processor 204 may adjust the distance between the simulated position of the reference vehicle and the vehicle position of the merging vehicle in the second lane to be directly proportional to the aggressive metric associated with the driver of the reference vehicle.
As discussed elsewhere herein, the simulated position of the reference vehicle may be indicated by a virtual target, and the merging vehicle may adjust its vehicle movement based on the virtual target. Thus, if the reference vehicle is operated by an aggressive driver, the merging vehicle may maintain a sufficient following distance relative to the simulated position of the reference vehicle indicated by the virtual target in which the simulated position of the reference vehicle is adjusted to be further away, thereby reducing the risk of collision in case the vehicle speed of the reference vehicle is abruptly changed when the merging vehicle merges with the reference vehicle in the first lane.”); a larger aggressive parameter indicating that the first target selection location is closer to the demerging point (See Wang paragraph 0059; “As an example of the conflict area, FIG. 9 illustrates a conflict area 902 of a merging zone 900. The merging zone 900 may be a portion of a traffic interchange (e.g., cloverleaf interchange) and may include an entrance ramp segment 910 located relatively close to an exit ramp segment 912. As depicted in FIG. 9, the freeway vehicle 103a may travel in the ramp adjacent lane 920, the merging vehicle 103c may plan on shifting from the entrance ramp segment 910 to the ramp adjacent lane 920 to enter the freeway, and the freeway vehicle 103d may plan on shifting from the freeway lane 922 to the ramp adjacent lane 920 to take the upcoming exit ramp segment 912 to exit the freeway. Thus, there is a potential collision in the conflict area 902 as the freeway vehicle 103a, the merging vehicle 103c, the freeway vehicle 103d may target the same opening space 930 in the ramp adjacent lane 920. As depicted in FIG.
9, the freeway vehicle 103e may plan on shifting from the freeway lane 922 to the ramp adjacent lane 920 to take the exit ramp segment 912 to exit the freeway, and the freeway vehicle 103b may plan on shifting from the ramp adjacent lane 920 to the freeway lane 922 to continue traveling on the freeway with higher speed. Thus, there is another potential collision in the conflict area 902 as the freeway vehicle 103e and the freeway vehicle 103b may cross paths with one another. As another example of the conflict area, in the merging situation depicted in FIG. 10A, the merging zone 1000 may include the conflict area 1001.”); and generating the first random probability for the fifth virtual simulated vehicle when the fifth virtual simulated vehicle travels to the first target selection location (See Wang paragraph 0100-0101; “In some embodiments, for a first proximate vehicle among the proximate vehicles, the safety factor processor 206 may determine the safety factors associated with the first proximate vehicle, the safety factors may describe the risk of a traffic accident between the merging vehicle and the first proximate vehicle. In some embodiments, the safety factor processor 206 may analyze the vehicle movement data of the first proximate vehicle determined from the sensor data as discussed above, and determine the vehicle speed of the first proximate vehicle (e.g., 23 m/s), the distance between the longitudinal position of the merging vehicle and the longitudinal position of the first proximate vehicle (e.g., 425 m), the collision metric indicating the estimated collision probability between the merging vehicle and the first proximate vehicle, the lane change probability that the first proximate vehicle may shift from its current lane to the lane of the merging vehicle, etc.
In some embodiments, to determine the collision metric, the safety factor processor 206 may determine the travel time Δt_longitudinal for the merging vehicle to travel the distance ahead between the longitudinal position of the merging vehicle and the longitudinal position of the first proximate vehicle using the longitudinal speed of the merging vehicle (e.g., 3 s), and determine the collision metric corresponding to the travel time Δt_longitudinal (e.g., 12%). In some embodiments, if the current lane of the first proximate vehicle is directly adjacent to the lane of the merging vehicle, the safety factor processor 206 may determine the travel time Δt_lateral for the first proximate vehicle to travel the distance between its lateral position and the lane boundary between these two lanes (e.g., 1 s), and determine the lane change probability corresponding to the travel time Δt_lateral (e.g., 60%). Other safety factors associated with the first proximate vehicle are also possible and contemplated. In some embodiments, the virtual assistance information renderer 208 may optionally overlay one or more safety information indicators that indicate the safety factors associated with one or more proximate vehicles in the field of view of the driver of the merging vehicle. The safety information indicator may be rendered in association with the corresponding proximate vehicle and may indicate one or more of the vehicle speed of the proximate vehicle, the distance between the merging vehicle and the proximate vehicle, the collision metric between the merging vehicle and the proximate vehicle (e.g., if the collision metric satisfies a collision metric threshold, e.g., higher than 50%), the lane change probability of the proximate vehicle (e.g., if the lane change probability satisfies a lane change probability threshold, e.g., higher than 55%), etc.
In some embodiments, if the blind spot status indicates that the blind spot area of the merging vehicle is occupied, the virtual assistance information renderer 208 may also overlay a safety information indicator that indicates the blind spot status in the field of view of the driver of the merging vehicle. In some embodiments, the virtual assistance information renderer 208 may render these safety information indicators in the front field of view and/or the side field of view of the driver of the merging vehicle. As discussed elsewhere herein, the front field of view may reflect the area of the roadway environment that the driver observes when looking ahead, and the side field of view may reflect the area of the roadway environment that the driver observes when looking to the side (e.g., left or right).”).
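For illustration only, the selection-location relationship recited in claim 11 (a larger aggressive parameter places the first target selection location closer to the demerging point, between the first and second preset distances, with the random probability generated once the vehicle travels to that location) can be sketched as follows. The linear interpolation, the [0, 1] range of the aggressive parameter, and all names are assumptions, not teachings of Wang or Tong:

```python
import random

def first_target_selection_location(demerging_point, first_preset_distance,
                                    second_preset_distance, aggressive_parameter):
    """Sketch of claim 11's recited location determination (hypothetical).

    aggressive_parameter is assumed to lie in [0, 1]; a larger value
    places the selection location closer to the demerging point.
    """
    # interpolate between the second (farther) and first (nearer) preset
    # distances according to the aggressive parameter
    distance = second_preset_distance - aggressive_parameter * (
        second_preset_distance - first_preset_distance)
    return demerging_point - distance  # location upstream of the demerging point

def first_random_probability():
    # generated only once the vehicle travels to its selection location
    return random.random()
```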
Regarding claim 12, Wang in view of Tong teaches the method according to claim 10. Wang further teaches, wherein the determining the first starting target information according to the first basic probability and the first random probability comprises: determining the first starting target information as the simulated off-ramp in response to that the first basic probability is greater than or equal to the first random probability; (See Wang paragraph 0100; “for a first proximate vehicle among the proximate vehicles, the safety factor processor 206 may determine the safety factors associated with the first proximate vehicle, the safety factors may describe the risk of a traffic accident between the merging vehicle and the first proximate vehicle. In some embodiments, the safety factor processor 206 may analyze the vehicle movement data of the first proximate vehicle determined from the sensor data as discussed above, and determine the vehicle speed of the first proximate vehicle (e.g., 23 m/s), the distance between the longitudinal position of the merging vehicle and the longitudinal position of the first proximate vehicle (e.g., 425 m), the collision metric indicating the estimated collision probability between the merging vehicle and the first proximate vehicle, the lane change probability that the first proximate vehicle may shift from its current lane to the lane of the merging vehicle, etc. In some embodiments, to determine the collision metric, the safety factor processor 206 may determine the travel time Δt_longitudinal for the merging vehicle to travel the distance ahead between the longitudinal position of the merging vehicle and the longitudinal position of the first proximate vehicle using the longitudinal speed of the merging vehicle (e.g., 3 s), and determine the collision metric corresponding to the travel time Δt_longitudinal (e.g., 12%).
In some embodiments, if the current lane of the first proximate vehicle is directly adjacent to the lane of the merging vehicle, the safety factor processor 206 may determine the travel time Δ.sub.t_lateral for the first proximate vehicle to travel the distance between its lateral position and the lane boundary between these two lanes (e.g., 1 s), and determine the lane change probability corresponding to the travel time Δ.sub.t_lateral (e.g., 60%). Other safety factors associated with the first proximate vehicle are also possible and contemplated. “); and determining the first starting target information as a downstream main road of the simulated off-ramp in response to that the first basic probability is less than the first random probability, the downstream main road of the simulated off-ramp belonging to the simulated main road, the downstream main road of the simulated off-ramp being connected to the sensing blank region, and the downstream main road of the simulated off-ramp not belonging to the sensing region with the sensed data; (See Wang paragraph 0101; “the virtual assistance information renderer 208 may optionally overlay one or more safety information indicators that indicate the safety factors associated with one or more proximate vehicles in the field of view of the driver of the merging vehicle. The safety information indicator may be rendered in association with the corresponding proximate vehicle and may indicate one or more of the vehicle speed of the proximate vehicle, the distance between the merging vehicle and the proximate vehicle, the collision metric between the merging vehicle and the proximate vehicle (e.g., if the collision metric satisfies a collision metric threshold, e.g., higher than 50%), the lane change probability of the proximate vehicle (e.g., if the lane change probability satisfies a lane change probability threshold, e.g., higher than 55%), etc. 
In some embodiments, if the blind spot status indicates that the blind spot area of the merging vehicle is occupied, the virtual assistance information renderer 208 may also overlay a safety information indicator that indicates the blind spot status in the field of view of the driver of the merging vehicle. In some embodiments, the virtual assistance information renderer 208 may render these safety information indicators in the front field of view and/or the side field of view of the driver of the merging vehicle. As discussed elsewhere herein, the front field of view may reflect the area of the roadway environment that the driver observes when looking ahead, and the side field of view may reflect the area of the roadway environment that the driver observes when looking to the side (e.g., left or right).”).
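For illustration only, the determination recited in claim 12 — selecting the simulated off-ramp when the first basic probability is greater than or equal to the first random probability, and the downstream main road of the simulated off-ramp otherwise — can be sketched as follows. The function and variable names are hypothetical and do not appear in the claims or the cited references:

```python
def first_starting_target(first_basic_probability: float,
                          first_random_probability: float) -> str:
    """Sketch of the claim 12 limitation: compare the first basic
    probability against the first random probability and pick the
    starting target accordingly."""
    if first_basic_probability >= first_random_probability:
        # basic probability meets or exceeds the random draw
        return "simulated off-ramp"
    # otherwise the vehicle starts on the main road downstream of the off-ramp
    return "downstream main road of the simulated off-ramp"
```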
With respect to independent claims 14 and 20, please see the rejection above with respect to claim 1, which is commensurate in scope with claims 14 and 20, with claim 1 being drawn to the method, claim 14 being drawn to a corresponding device, and claim 20 to a non-transitory computer-readable storage medium.
With respect to dependent claims 18 and 19, please see the rejection above with respect to claims 5 and 6, which are commensurate in scope with claims 18 and 19, with claims 5 and 6 being drawn to the method and claims 18 and 19 being drawn to a corresponding device.
Claims 2-4, 13 and 15-17 are rejected under 35 U.S.C. 103 as being unpatentable over Wang (US Pub. No. 2020/0247412 A1) in view of Sun (CN 113342704 A) and Tong (CN 113674546 A).
Regarding claim 2 Wang in view of Tong teaches the method according to claim 1, Wang does not teach but Sun teaches, wherein the generating a first virtual simulated vehicle in the simulated ramp comprises: obtaining, in response to that historical data corresponding to the simulated ramp is not an empty set, historical data corresponding to the simulation starting moment from the historical data corresponding to the simulated ramp as a first starting traffic status corresponding to the simulated ramp, and generating the first virtual simulated vehicle in the simulated ramp according to the first starting traffic status; (See Sun paragraph 0099 and 00113; “…the scene playback information can be divided into two parts of scene playback information based on the information points in the above mode switching conditions, namely the first scene playback information for the playback scene simulation mode, and the second scene for the virtual scene simulation mode Replay information. Among them, the playback information of the first scene may include real data corresponding to one moment, or real data corresponding to multiple consecutive moments respectively; the playback information of the second scene may include real data corresponding to one moment, and may include multiple consecutive moments corresponding to each other. The real data, or the empty set… When the driving simulation system is in the playback scene simulation model, the driving simulation behavior of the target simulated vehicle is the playback driving simulation behavior, and the playback driving simulation behavior is determined by the scene playback information, assuming that the mode switching condition is that the driving simulation system accumulates running time to 40s. 
It is understandable that the scene playback information may include playback data corresponding to multiple consecutive moments, and one playback data may include the location information and trajectory information of the target vehicle at a certain moment (which may include driving direction and driving speed, etc.), and Location information of obstacle objects (which can include people, vehicles, and other objects, etc.). The data type of the playback data can be set according to actual application scenarios, and the embodiment of the present application does not limit the data type of the playback data.”).
Both Wang and Sun are in the same field of data processing methods and apparatus for road simulation. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Wang's computer device running a driving simulation system with Sun's generating and controlling of a virtual simulated vehicle in the simulated ramp. No new functionality would arise from the combination, and the combination would improve the usability of Wang by adding the generating and controlling of a virtual simulated vehicle in the simulated ramp to allow better prediction of simulated data; one of ordinary skill in the art would have recognized that the results of the combination were predictable.
Wang does not exactly teach but Tong teaches, and determining a second starting traffic status corresponding to the simulated ramp according to a target traffic status in a basic traffic graph corresponding to the simulated ramp in response to that the historical data corresponding to the simulated ramp is an empty set, and generating the first virtual simulated vehicle in the simulated ramp according to the second starting traffic status; (See Tong paragraph 0079-0082; “According to the real-time road conditions, start a round of calculation with a time interval of 30 minutes…Obtain the occupancy time ti of the i-th vehicle in the current time step through the coil, and the time occupancy rate of the road is calculated as:
Among them, Rt is the occupancy rate of the road, tT is the total observation time, and N is the number of vehicles on the road segment… If Rt>70%, the emergency lane will be opened at the next time step, otherwise it will not be opened.”; also see Tong paragraph 0084-0086; “Step1: According to the real-time headway data collected by the coil, a round of calculation is initiated with a time interval Δt of 10s.
Step2: Calculate the time occupancy Rtz, Rtm1 and Rtm2 of the ramp and the two rightmost lanes of the main road respectively according to the formula (2) in the third module. If the occupancy rate of the ramp within 10s is Rtz>Rtm1 and Rtz>Rtm2, the ramp has the right of way and the green light is open for at least 20s and at most 90s. Otherwise, the ramp has no right of way.
Step3: Adjust the ramp signal light to complete the control according to the instructions of Step2.”).
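For illustration only, the occupancy-based ramp control quoted from Tong can be sketched as follows, assuming (as the quoted passage indicates) that the time occupancy rate is the sum of the per-vehicle occupancy times divided by the total observation time, and that the ramp receives right of way only when its occupancy exceeds both rightmost main-road lanes. All names are hypothetical:

```python
def time_occupancy(occupancy_times, total_observation_time):
    """Rt = (sum of per-vehicle occupancy times t_i) / total observation time tT,
    per the formula described in the quoted Tong passage."""
    return sum(occupancy_times) / total_observation_time

def ramp_has_right_of_way(r_ramp, r_main_lane_1, r_main_lane_2):
    """Tong's check: the ramp gets the green light (20-90 s) only when its
    occupancy over the 10 s interval exceeds both rightmost main-road lanes."""
    return r_ramp > r_main_lane_1 and r_ramp > r_main_lane_2
```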
Both Wang and Tong are in the same field of data processing methods and apparatus for road simulation. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Wang's computer device running a driving simulation system with Tong's predicted traffic status of the simulated ramp. No new functionality would arise from the combination, and the combination would improve the usability of Wang by adding the predicted traffic status of the simulated ramp to allow better prediction of simulated data with prediction of traffic in the simulation; one of ordinary skill in the art would have recognized that the results of the combination were predictable.
Regarding claim 3 Wang in view of Tong teaches the method according to claim 2, Wang further teaches, wherein the generating the first virtual simulated vehicle in the simulated ramp according to the first starting traffic status comprises: determining an average vehicle spacing corresponding to the simulated ramp according to a vehicle density in the first starting traffic status; (See Wang paragraph 0059 and 0064; “FIG. 9 illustrates a conflict area 902 of a merging zone 900. The merging zone 900 may be a portion of a traffic interchange (e.g., cloverleaf interchange) and may include an entrance ramp segment 910 located relatively close to an exit ramp segment 912. As depicted in FIG. 9, the freeway vehicle 103a may travel in the ramp adjacent lane 920, the merging vehicle 103c may plan on shifting from the entrance ramp segment 910 to the ramp adjacent lane 920 to enter the freeway, and the freeway vehicle 103d may plan on shifting from the freeway lane 922 to the ramp adjacent lane 920 to take the upcoming exit ramp segment 912 to exit the freeway. Thus, there is a potential collision in the conflict area 902 as the freeway vehicle 103a, the merging vehicle 103c, the freeway vehicle 103d may target the same opening space 930 in the ramp adjacent lane 920. As depicted in FIG. 9, the freeway vehicle 103e may plan on shifting from the freeway lane 922 to the ramp adjacent lane 920 to take the exit ramp segment 912 to exit the freeway, and the freeway vehicle 103b may plan on shifting from the ramp adjacent lane 920 to the freeway lane 922 to continue traveling on the freeway with higher speed. Thus, there is another potential collision in the conflict area 902 as the freeway vehicle 103e and the freeway vehicle 103b may cross paths with one another. As another example of the conflict area, in the merging situation depicted in FIG. 10A, the merging zone 1000 may include the conflict area 1001.
In block 404, the reference vehicle processor 202 may estimate one or more merging timestamps at which one or more vehicle platforms 103 traveling in the first lane and the second lane may arrive at the merging point between the first lane and the second lane. In some embodiments, to determine the estimated merging timestamp of a vehicle platform 103 traveling in the first lane or the second lane, the reference vehicle processor 202 may analyze the vehicle movement data of the vehicle platform 103 to extract the vehicle position and the vehicle speed (including the speed's rate of change in some cases) of the vehicle platform 103. The reference vehicle processor 202 may determine the distance d between the vehicle position of the vehicle platform 103 and the merging point (e.g., 240 m), and compute the travel time Δ.sub.t for the vehicle platform 103 to travel the distance d and reach the merging point using the longitudinal speed υ.sub.longitudial of the vehicle platform 103 (e.g., 30 m/s). For example, the reference vehicle processor 202 may compute the travel time Δ.sub.t=d/υ.sub.longitudial=8 s. The reference vehicle processor 202 may then compute the estimated merging timestamp t.sub.merging at which the vehicle platform 103 potentially arrives at the merging point using the current timestamp t.sub.current (e.g., 14:00:00). For example, the reference vehicle processor 202 may compute the estimated merging timestamp t.sub.merging=t.sub.current+Δ.sub.t=14:00:08.”).
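For illustration only, the travel-time arithmetic in the quoted Wang passage (Δt = d/υ_longitudinal and t_merging = t_current + Δt, e.g., 240 m at 30 m/s giving 8 s, so 14:00:00 yields 14:00:08) can be reproduced in a short sketch; the function name is hypothetical:

```python
from datetime import datetime, timedelta

def estimated_merging_timestamp(distance_m, longitudinal_speed_mps, current_time):
    """Sketch of Wang's computation: travel time to the merging point is
    distance over longitudinal speed; the merging timestamp is the current
    timestamp plus that travel time."""
    travel_time_s = distance_m / longitudinal_speed_mps  # Δt = d / υ_longitudinal
    return current_time + timedelta(seconds=travel_time_s)  # t_merging = t_current + Δt
```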
Wang does not explicitly teach, but Tong teaches, generating, in response to that the simulated ramp is a simulated on-ramp, the first virtual simulated vehicle in the simulated on-ramp, according to the average vehicle spacing, starting from a merging point of the simulated on-ramp in a direction opposite to a traveling direction of the simulated on-ramp; and generating, in response to that the simulated ramp is a simulated off-ramp, the first virtual simulated vehicle in the simulated off-ramp, according to the average vehicle spacing, starting from a demerging point of the simulated off-ramp in a traveling direction of the simulated off-ramp; (See Tong paragraph 0030 and 0058; “The ramp control module is turned on. According to the real-time headway data collected by the coil, a round of calculation is initiated with a time interval Δt of 10s, and the time occupancy rate Rtz of the ramp and the rightmost of the main road are respectively calculated according to the formula (2) in step 38 The time occupancy rates of the two lanes Rtm1 and Rtm2. If the ramp occupancy rate Rtz>Rtm1 and Rtz>Rtm2 within 10s, the ramp has the right of way and the green light is open for at least 20s and at most 90s.
Otherwise, the ramp has no right of way… Parameter unit description RSU setting interval distance m Distance between adjacent RSUs RSU feedback time granularity min Time range corresponding to RSU return road condition information v2v Working distance m v2x Maximum distance time for vehicles to obtain other v2x vehicle status information s Simulation duration Flow Veh/ h Traffic volume v2x vehicle penetration rate% v2x vehicle percentage in the vehicle unmanned vehicle penetration rate% unmanned vehicle proportion in the vehicle compliance rate% The driver accepts the driving behavior recommendations given by the assisted driving system to the proportion of the control recommendations to comply with Rate% Drivers accept the traffic control recommendations given by the control system.”).
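For illustration only, the placement recited in claim 3 — spacing vehicles from the merging point against the travel direction for a simulated on-ramp, or from the demerging point along the travel direction for a simulated off-ramp — can be sketched as follows, assuming (as the claim suggests but does not specify) that the average vehicle spacing is the reciprocal of the vehicle density. All names are hypothetical:

```python
def spawn_positions(ramp_type, anchor_position, vehicle_density, ramp_length):
    """Sketch of the claim 3 limitation. vehicle_density is vehicles per
    metre; anchor_position is the merging point (on-ramp) or demerging
    point (off-ramp) measured along the ramp."""
    spacing = 1.0 / vehicle_density          # assumed: average spacing = 1 / density
    n = int(ramp_length // spacing)          # how many vehicles fit on the ramp
    if ramp_type == "on-ramp":
        # place vehicles behind the merging point, opposite the travel direction
        return [anchor_position - k * spacing for k in range(1, n + 1)]
    # off-ramp: place vehicles past the demerging point, along the travel direction
    return [anchor_position + k * spacing for k in range(1, n + 1)]
```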
Both Wang and Tong are in the same field of data processing methods and apparatus for road simulation. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Wang's computer device running a driving simulation system with Tong's predicted traffic status of the simulated ramp. No new functionality would arise from the combination, and the combination would improve the usability of Wang by adding the predicted traffic status of the simulated ramp to allow better prediction of simulated data with prediction of traffic in the simulation; one of ordinary skill in the art would have recognized that the results of the combination were predictable.
Regarding claim 4 Wang in view of Tong teaches the method according to claim 1, Wang does not explicitly teach but Sun teaches, wherein the simulated ramp includes a simulated on-ramp, the method further comprises: determining a vehicle outputting region in the simulated on-ramp; and generating a fourth virtual simulated vehicle in the vehicle outputting region; and the at least one second virtual simulated vehicle traveling in the simulated ramp in the simulation reproduction stage further comprises the fourth virtual simulated vehicle; (See Sun paragraph 00215-00220; “Referring again to FIG. 6, the data processing apparatus 1 may further include: a fourth obtaining module 19. The fourth acquisition module 19 is used to acquire the initial driving simulation model; the tested driving simulation model is an updated version of the initial driving simulation model; The fourth acquisition module 19 is also used to acquire test scene information for the target simulation vehicle; the test scene information includes virtual target roads, virtual obstacle objects, virtual obstacle positions corresponding to the virtual obstacle objects, and initial virtual vehicle corresponding to the target simulation vehicle…The fourth acquisition module 19 is also used to input the test scene information into the initial driving simulation model, and output virtual driving data of the target simulated vehicle through the initial driving simulation model; The fourth acquisition module 19 is also used to determine virtual driving data and test scene information as scene playback information.
For the specific functional implementation of the fourth acquiring module 19, refer to step S101 in the embodiment corresponding to FIG. 3 above, and details are not described herein again.”).
Both Wang and Sun are in the same field of data processing methods and apparatus for road simulation. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Wang's computer device running a driving simulation system with Sun's generating and controlling of a virtual simulated vehicle in the simulated ramp. No new functionality would arise from the combination, and the combination would improve the usability of Wang by adding the generating and controlling of a virtual simulated vehicle in the simulated ramp to allow better prediction of simulated data; one of ordinary skill in the art would have recognized that the results of the combination were predictable.
Regarding claim 13 Wang in view of Tong teaches the method according to claim 10, Wang further teaches, and in response to that the historical data corresponding to the simulated off-ramp is an empty set, and the historical data corresponding to the downstream main road of the simulated off-ramp is an empty set, obtaining a first lane count of the simulated off-ramp and a second lane count of the downstream main road of the simulated off-ramp, determining a lane count sum of the first lane count and the second lane count, and determining a ratio of the first lane count to the lane count sum as the first basic probability; (See Wang paragraph 0094 and 0096; “FIG. 5, in block 506, the merging plan processor 204 may optionally determine a position range for positioning the merging vehicle in the second lane based on the simulated position of the reference vehicle in the second lane indicated by the virtual target. In some embodiments, the position range may include a minimum (min) region, a safe region, and a max region. Each region may be a portion of the second lane located behind the simulated position of the reference vehicle. In some embodiments, the min region may specify a portion of the second lane that satisfies a minimal threshold distance to the simulated position of the reference vehicle (e.g., [40 m, 60 m]). The merging vehicle should not be ahead of the min region to avoid a potential collision with the reference vehicle when the merging vehicle merges with the reference vehicle in the first lane. In some embodiments, the safe region may specify a portion of the second lane that satisfies a safe threshold distance to the simulated position of the reference vehicle (e.g., [60 m, 115 m]). The merging vehicle should maintain its position within the safe region to smoothly merge with the reference vehicle in the first lane as the merging vehicle reaches the merging point. 
In some embodiments, the max region may specify a portion of the second lane that satisfies a maximal threshold distance to the simulated position of the reference vehicle (e.g., [115 m, 185 m]). The merging vehicle should not be behind the max region so that other vehicles may not arrive at the merging point between the merging vehicle and the reference vehicle, thereby avoiding the need to abruptly change the merging plan of the merging vehicle to adapt accordingly.
In block 508, the virtual assistance information renderer 208 may optionally overlay a virtual position indicator indicating the position range for the merging vehicle in the field of view of the driver of the merging vehicle. For example, as depicted in FIG. 11A, the virtual assistance information renderer 208 may render the virtual position indicator 1140 in the field of view 1100 of the driver of the merging vehicle. As shown, the virtual position indicator 1140 may be rendered relative to the virtual target 1104 on the front display surface 1120 and may indicate the min region, the safe region, the max region of the position range that are located behind the simulated position of the reference vehicle indicated by the virtual target 1104. In some embodiments, the virtual assistance information renderer 208 may also render a merging instruction 1142 in the field of view 1100 instructing the driver of the merging vehicle to follow the virtual target 1104 to smoothly perform the merging process. To follow the virtual target 1104, the driver of the merging vehicle may position the merging vehicle in the lane 1122 according to the regions indicated by the virtual position indicator 1140, thereby maintaining an appropriate following distance to the simulated position of the reference vehicle 1102 indicated by the virtual target 1104. As the merging vehicle maintains an appropriate following distance to the simulated position of the reference vehicle, the merging vehicle can smoothly merge with the reference vehicle as the merging vehicle reaches the merging point.”).
Wang does not explicitly teach but Sun teaches, wherein the determining a first basic probability of the fifth virtual simulated vehicle for the simulated off-ramp comprises: in response to that historical data corresponding to the simulated off-ramp is not an empty set, and historical data corresponding to the downstream main road of the simulated off-ramp is not an empty set, obtaining a corresponding off-ramp vehicle flow from the historical data corresponding to the simulated off-ramp; (See Sun paragraph 0099 and 00113; “…the scene playback information can be divided into two parts of scene playback information based on the information points in the above mode switching conditions, namely the first scene playback information for the playback scene simulation mode, and the second scene for the virtual scene simulation mode Replay information. Among them, the playback information of the first scene may include real data corresponding to one moment, or real data corresponding to multiple consecutive moments respectively; the playback information of the second scene may include real data corresponding to one moment, and may include multiple consecutive moments corresponding to each other. The real data, or the empty set… When the driving simulation system is in the playback scene simulation model, the driving simulation behavior of the target simulated vehicle is the playback driving simulation behavior, and the playback driving simulation behavior is determined by the scene playback information, assuming that the mode switching condition is that the driving simulation system accumulates running time to 40s. 
It is understandable that the scene playback information may include playback data corresponding to multiple consecutive moments, and one playback data may include the location information and trajectory information of the target vehicle at a certain moment (which may include driving direction and driving speed, etc.), and Location information of obstacle objects (which can include people, vehicles, and other objects, etc.). The data type of the playback data can be set according to actual application scenarios, and the embodiment of the present application does not limit the data type of the playback data.”).
Both Wang and Sun are in the same field of data processing methods and apparatus for road simulation. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Wang's computer device running a driving simulation system with Sun's generating and controlling of a virtual simulated vehicle in the simulated ramp. No new functionality would arise from the combination, and the combination would improve the usability of Wang by adding the generating and controlling of a virtual simulated vehicle in the simulated ramp to allow better prediction of simulated data; one of ordinary skill in the art would have recognized that the results of the combination were predictable.
Wang does not explicitly teach but Tong teaches, and obtaining a corresponding downstream main road vehicle flow from the historical data corresponding to the downstream main road of the simulated off-ramp; determining a vehicle flow sum of the off-ramp vehicle flow and the downstream main road vehicle flow, and determining a ratio of the off-ramp vehicle flow to the vehicle flow sum as the first basic probability of the fifth virtual simulated vehicle for the simulated off-ramp; (See Tong paragraph 0017-0019; “Step 22. Read the flow rate, simulation time, V2X intelligent networked vehicle ratio, L5 unmanned vehicle ratio parameters, combined with the measured flow data, generate a .rou.xml file that defines the vehicle, traffic flow and its path in the simulation, in .sumocfg Realize the call in the simulation configuration file; Step 23. Read the parameters of the maximum sensing range of the sensing device, the maximum distance of vehicle-to-vehicle communication, and the maximum distance of vehicle-device communication. In each simulation step, each V2X vehicle will receive the traffic operation within its sensing range and communication range status information; Step 24. In each simulation step, according to the information feedback received by each vehicle, the operation of the vehicle is controlled according to the vehicle function settings. The vehicle function modules are divided into driving control, driving advice and smart navigation. The three functions are all Switch settings can be achieved through parameter rewriting”; also see Tong paragraph 0030; “Step 39. The ramp control module is turned on. According to the real-time headway data collected by the coil, a round of calculation is initiated with a time interval Δt of 10s, and the time occupancy rate Rtz of the ramp and the rightmost of the main road are respectively calculated according to the formula (2) in step 38 The time occupancy rates of the two lanes Rtm1 and Rtm2. 
If the ramp occupancy rate Rtz>Rtm1 and Rtz>Rtm2 within 10s, the ramp has the right of way and the green light is open for at least 20s and at most 90s. Otherwise, the ramp has no right of way.”).
Both Wang and Tong are in the same field of data processing methods and apparatus for road simulation. It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Wang's computer device running a driving simulation system with Tong's predicted traffic status of the simulated ramp. No new functionality would arise from the combination, and the combination would improve the usability of Wang by adding the predicted traffic status of the simulated ramp to allow better prediction of simulated data with prediction of traffic in the simulation; one of ordinary skill in the art would have recognized that the results of the combination were predictable.
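For illustration only, the first basic probability recited in claim 13 — a ratio of the off-ramp vehicle flow to the total flow when historical data exists for both the simulated off-ramp and its downstream main road, and a ratio of the off-ramp lane count to the total lane count when both historical data sets are empty — can be sketched as follows; the names are hypothetical:

```python
def first_basic_probability(off_ramp_flow=None, main_road_flow=None,
                            off_ramp_lanes=None, main_road_lanes=None):
    """Sketch of the claim 13 limitation. Flow arguments are used when
    historical data exists for both road segments; lane counts are the
    fallback when both historical data sets are empty."""
    if off_ramp_flow is not None and main_road_flow is not None:
        # ratio of off-ramp flow to the vehicle flow sum
        return off_ramp_flow / (off_ramp_flow + main_road_flow)
    # ratio of the first lane count to the lane count sum
    return off_ramp_lanes / (off_ramp_lanes + main_road_lanes)
```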
With respect to dependent claims 15-17, please see the rejection above with respect to claims 2-4, which are commensurate in scope with claims 15-17, with claims 2-4 being drawn to the method and claims 15-17 being drawn to a corresponding device.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to LIDIA KWIATKOWSKA whose telephone number is (571)272-5161. The examiner can normally be reached Monday-Friday 8:00-5:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Scott A. Browne can be reached at (571) 270-0151. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/L.K./Examiner, Art Unit 3666
/SCOTT A BROWNE/Supervisory Patent Examiner, Art Unit 3666