DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of the Claims
This Office Action is in response to the claims filed on October 7, 2025.
Claims 1-20 have been presented for examination.
Claims 1-20 are currently rejected.
Claims 8-20 are rejected under 35 U.S.C. 101.
Claims 1-3, 5-10, 12-17, and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Cella et al. (U.S. Patent Publication Number 2023/0058169) in view of Isele (U.S. Patent Publication Number 2020/0391738).
Claims 4, 11, and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Cella et al. (U.S. Patent Publication Number 2023/0058169) in view of Isele (U.S. Patent Publication Number 2020/0391738), further in view of Qi et al. (U.S. Patent Publication Number 2021/0081787).
Response to Arguments
Applicant's arguments filed on October 7, 2025, have been fully considered, but they are not persuasive.
35 U.S.C. 101
Regarding 35 U.S.C. 101, the Applicant argues that the claims integrate the abstract idea into a practical application by including the limitation “instructions that, as a result of being executed by one or more processors of a computer system,” and “caus[ing] an autonomous machine to perform one or more actions based on the updated traffic model” (see Applicant Remarks page 9).
The Examiner has considered the arguments presented and respectfully disagrees. Under 35 U.S.C. 101 Analysis Step 2A Prong 2, the courts have indicated that additional elements such as: merely using a computer to implement an abstract idea, adding insignificant extra-solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application.”
With respect to claims 8 and 15, the executable instructions are recited at a high level of generality and amount to data being used in a generic or general-purpose computing environment. Further, the “caus[ing] the autonomous machine to perform one or more actions based on the updated traffic model” is merely an intended result of the mental process steps and extra-solution activity, such that execution of those steps would “cause” the action to be performed, but performing the action itself is not positively recited.
Additionally, dependent claims 9-14 and 16-20 do not recite any further limitations that cause the claim(s) to be patent eligible.
The Applicant further argues that such application would improve performance and improve accuracy of traffic models used to cause autonomous machines to perform such actions. The Applicant argues that the practical application is to reflect more realistic traffic scenarios, which solves a technical problem with a technical solution (see Applicant Remarks page 9).
The Examiner has considered the arguments presented and respectfully disagrees. Assertions of improvement in a technological field must not be directed to an abstract idea (see MPEP 2106). In Berkheimer v. HP Inc., 881 F.3d 1360 (Fed. Cir. 2018), the Federal Circuit held that improvements are only considered “to the extent they are captured in the claims.” Berkheimer at 1369. Further, claims are only integrated into a practical application when the alleged improvement is not directed to the judicial exception. See MPEP 2106.04(d)(2). In the present application, the improvements presented by the Applicant are not expressly captured in the claims. Even so, the improvement is directed to the judicial exception because the additional elements included in the claims are recited at a high level of generality and do not integrate the abstract idea into a practical application, as they do not impose any meaningful limits on practicing the abstract idea.
For these reasons, the Examiner maintains the 35 U.S.C. 101 rejection for claims 8-20.
Should the claim be intended to control the vehicle to perform the one or more actions, the Examiner suggests amending the limitation to recite “controlling the autonomous machine to perform one or more actions based on the updated traffic model.”
35 U.S.C. 103
The Applicant further argues that the combination of Cella and Isele fails to teach or suggest “accessing preference data indicating a ranking, by one or more users, of a realism of the one or more traffic scenes” (see Applicant Remarks page 10). The Applicant appears to describe the disclosure of Cella (see Applicant Remarks pages 10-11) and concludes that Cella does not suggest a ranking of traffic scenes by a user, nor a ranking of the realism of traffic scenes.
The Examiner has considered the arguments presented and respectfully disagrees. First, the Applicant merely appears to describe the disclosure of Cella and conclude that Cella does not teach or suggest the recited claim elements, but does not provide contrary evidence or explanation establishing that the reference being relied on would not enable a skilled artisan to produce the recited limitations.
Even so, Cella ¶ 463 discloses a search result display ranking circuit that orders (i.e., ranks) search results based on a relevance, wherein the rankings are based on one or more metrics defined by a user, as disclosed in ¶ 542. One having ordinary skill in the art would further recognize that ordering the results based on relevance, such as ranking favorable results based on proximity to the route (see ¶ 462), is ranking the one or more traffic scenes by a realism, because under its broadest reasonable interpretation, “realism” refers to the practicality of the traffic scene in reality; therefore, a favorable condition proximal to the route indicates a realism of the traffic scene (see Merriam-Webster, “realism”). Further, Cella expressly discloses in ¶ 177 that “the vehicle routing system 1492 accounts for the routing preference 14100 of the user 1490 when routing.” Therefore, Cella, taken alone or in combination with Isele, does teach or suggest “accessing preference data indicating a ranking, by one or more users, of a realism of the one or more traffic scenes.”
For these reasons, the Examiner maintains the prior art rejection for claim 1. Independent claims 8 and 15 recite parallel limitations to those provided in claim 1; therefore, the prior art rejections for claims 8 and 15 are maintained under the same rationale. Dependent claims 2-3, 5-7, 9-10, 12-14, 16-17, and 19-20 depend from the above discussed claims, and their rejection is supported by the rationale provided by the Examiner.
With respect to claims 4, 11, and 18, the Applicant appears to conclude that the claims are allowable due to their dependency from claims 1, 8, and 15. The Applicant further appears to conclude that elements of claims 4, 11, and 18 are not taught or rendered obvious by the prior art of record without providing contrary evidence or further explanation establishing that the reference being relied on would not enable a skilled artisan to produce the recited limitations. Therefore, the Applicant’s arguments are not persuasive. Even so, claims 4, 11, and 18 depend from the above discussed claims, and their rejection is supported by the rationale provided by the Examiner.
For these reasons, the Examiner maintains the prior art rejection. Additional citations from the prior art of record are provided for further clarification.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 8-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Claim 8
Claim 8. A non-transitory computer readable storage medium storing thereon executable instructions that, as a result of being executed by one or more processors of a computer system, cause the computer system to:
access one or more traffic scenes of a traffic model;
access preference data indicating a ranking, by one or more users, of a realism of the one or more traffic scenes;
calculate, using a reward model, a reward value based, at least in part, on the preference data;
update the traffic model using the reward value; and
cause an autonomous machine to perform one or more actions based on the updated traffic model.
101 Analysis - Step 1: Statutory category – Yes
The claim recites a non-transitory computer readable storage medium storing executable instructions, which is an article of manufacture. The claim falls within one of the four statutory categories. See MPEP 2106.03.
101 Analysis - Step 2A Prong one evaluation: Judicial Exception – Yes – Mental processes
In Step 2A, Prong one of the 2019 Patent Eligibility Guidance (PEG), a claim is to be analyzed to determine whether it recites subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) mental processes, and/or c) certain methods of organizing human activity.
The Office submits that the foregoing bolded limitation(s) constitute judicial exceptions in terms of “mental processes” because, under their broadest reasonable interpretation, the limitations can be “performed in the human mind, or by a human using a pen and paper.” See MPEP 2106.04(a)(2)(III).
The claim recites the limitation of “calculate, using a reward model, a reward value based, at least in part, on the preference data.” This limitation, as drafted, is a simple process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind. That is, nothing in the claim elements precludes the step from practically being performed in the mind. For example, the claim encompasses a person looking at data collected and forming a simple judgment. Specifically, the person would mentally, or with the aid of pen and paper, perform calculations of a reward value based on preference information.
Thus, the claim recites a mental process.
101 Analysis - Step 2A Prong two evaluation: Practical Application - No
In Step 2A, Prong two of the 2019 PEG, a claim is to be evaluated whether, as a whole, it integrates the recited judicial exception into a practical application. As noted in MPEP 2106.04(d), it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception, such that the claim is more than a drafting effort designed to monopolize the judicial exception. The courts have indicated that additional elements such as: merely using a computer to implement an abstract idea, adding insignificant extra-solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application.”
The Office submits that the foregoing underlined limitation(s) recite additional elements that do not integrate the recited judicial exception into a practical application.
The claim recites the additional elements or steps of “access one or more traffic scenes of a traffic model; access preference data indicating a ranking, by one or more users, of a realism of the one or more traffic scenes; update the traffic model using the reward value; and cause an autonomous machine to perform one or more actions based on the updated traffic model.”
The accessing steps are recited at a high level of generality (i.e., as a general means of obtaining data) and amount to mere data gathering, which is a form of insignificant extra-solution activity. The updating step is also recited at a high level of generality (i.e., as a general means of generating data) and amounts to mere post-solution data output, which is a form of insignificant extra-solution activity.
Additionally, the recited step to “cause an autonomous machine to perform one or more actions based on the updated traffic model” is insignificant extra-solution activity describing an intended result of the mental process steps and does not positively recite controlling the machine to perform an action, and further merely uses an existing traffic model to apply the step to the computing environment.
Accordingly, even in combination, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
101 Analysis - Step 2B evaluation: Inventive concept - No
In Step 2B of the 2019 PEG, a claim is to be evaluated as to whether the claim, as a whole, amounts to significantly more than the recited exception, i.e., whether any additional element, or combination of additional elements, adds an inventive concept to the claim. See MPEP 2106.05.
As discussed with respect to Step 2A Prong Two, the additional elements in the claim amount to no more than mere instructions to apply the exception using a generic computer component. The same analysis applies here in 2B, i.e., mere instructions to apply an exception on a generic computer cannot integrate a judicial exception into a practical application at Step 2A or provide an inventive concept in Step 2B.
Under the 2019 PEG, a conclusion that an additional element is insignificant extra-solution activity in Step 2A should be re-evaluated in Step 2B. Here, the accessing steps, the updating step, and the causing step were considered to be insignificant extra-solution activity in Step 2A, and thus they are re-evaluated in Step 2B to determine if they are more than what is well-understood, routine, conventional activity in the field. The specification does not provide any indication that the computer system is anything other than a generic, general-purpose computer. MPEP 2106.05(d)(II), and the cases cited therein, including Intellectual Ventures I, LLC v. Symantec Corp., 838 F.3d 1307, 1321 (Fed. Cir. 2016), TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610 (Fed. Cir. 2016), and OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363 (Fed. Cir. 2015), indicate that mere collection or receipt of data over a network is a well-understood, routine, and conventional function when it is claimed in a merely generic manner (as it is here). Further, the Federal Circuit in Trading Techs. Int’l v. IBG LLC, 921 F.3d 1084, 1093 (Fed. Cir. 2019), and Intellectual Ventures I LLC v. Erie Indemnity Co., 850 F.3d 1315, 1331 (Fed. Cir. 2017), for example, indicated that the mere output of data is a well-understood, routine, and conventional function. Accordingly, a conclusion that the accessing and updating steps are well-understood, routine, conventional activity is supported under Berkheimer.
Thus, the claim is ineligible.
Claim 15
Claim 15. A system comprising:
one or more processors to:
access one or more traffic scenes of a traffic model;
access preference data indicating a ranking, by one or more users, of a realism of the one or more traffic scenes;
calculate, using a reward model, a reward value, using one or more neural network, based, at least in part, on the preference data;
update the traffic model using the reward value; and
cause an autonomous machine to perform one or more actions based on the updated traffic model.
101 Analysis - Step 1: Statutory category – Yes
The claim recites a system comprising one or more processors, which is a machine. The claim falls within one of the four statutory categories. See MPEP 2106.03.
101 Analysis - Step 2A Prong one evaluation: Judicial Exception – Yes – Mental processes
In Step 2A, Prong one of the 2019 Patent Eligibility Guidance (PEG), a claim is to be analyzed to determine whether it recites subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) mental processes, and/or c) certain methods of organizing human activity.
The Office submits that the foregoing bolded limitation(s) constitute judicial exceptions in terms of “mental processes” because, under their broadest reasonable interpretation, the limitations can be “performed in the human mind, or by a human using a pen and paper.” See MPEP 2106.04(a)(2)(III).
The claim recites the limitation of “calculate, using a reward model, a reward value based, at least in part, on the preference data.” This limitation, as drafted, is a simple process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind. That is, nothing in the claim elements precludes the step from practically being performed in the mind. For example, the claim encompasses a person looking at data collected and forming a simple judgment. Specifically, the person would mentally, or with the aid of pen and paper, perform calculations of a reward value based on preference information.
Thus, the claim recites a mental process.
101 Analysis - Step 2A Prong two evaluation: Practical Application - No
In Step 2A, Prong two of the 2019 PEG, a claim is to be evaluated whether, as a whole, it integrates the recited judicial exception into a practical application. As noted in MPEP 2106.04(d), it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception, such that the claim is more than a drafting effort designed to monopolize the judicial exception. The courts have indicated that additional elements such as: merely using a computer to implement an abstract idea, adding insignificant extra-solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application.”
The Office submits that the foregoing underlined limitation(s) recite additional elements that do not integrate the recited judicial exception into a practical application.
The claim recites the additional elements or steps of “one or more processors to: access one or more traffic scenes of a traffic model; access preference data indicating a ranking, by one or more users, of a realism of the one or more traffic scenes; update the traffic model using the reward value; and cause an autonomous machine to perform one or more actions based on the updated traffic model.”
The accessing steps are recited at a high level of generality (i.e., as a general means of obtaining data) and amount to mere data gathering, which is a form of insignificant extra-solution activity. The updating step is also recited at a high level of generality (i.e., as a general means of generating data) and amounts to mere post-solution data output, which is a form of insignificant extra-solution activity.
Additionally, the recited step to “cause an autonomous machine to perform one or more actions based on the updated traffic model” is insignificant extra-solution activity describing an intended result of the mental process steps and does not positively recite controlling the machine to perform an action, and further merely uses an existing traffic model to apply the step to the computing environment.
Similarly, the “one or more processors” and the “one or more neural network” merely describe how to generally “apply” the otherwise mental judgments using a generic or general-purpose processing environment, i.e., a computer. The “one or more processors” and the “one or more neural network” are generic computer components that are recited at a high level of generality and merely automate the calculating step.
Accordingly, even in combination, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
101 Analysis - Step 2B evaluation: Inventive concept - No
In Step 2B of the 2019 PEG, a claim is to be evaluated as to whether the claim, as a whole, amounts to significantly more than the recited exception, i.e., whether any additional element, or combination of additional elements, adds an inventive concept to the claim. See MPEP 2106.05.
As discussed with respect to Step 2A Prong Two, the additional elements in the claim amount to no more than mere instructions to apply the exception using a generic computer component. The same analysis applies here in 2B, i.e., mere instructions to apply an exception on a generic computer cannot integrate a judicial exception into a practical application at Step 2A or provide an inventive concept in Step 2B.
Under the 2019 PEG, a conclusion that an additional element is insignificant extra-solution activity in Step 2A should be re-evaluated in Step 2B. Here, the accessing steps, the updating step, and the causing step were considered to be insignificant extra-solution activity in Step 2A, and thus they are re-evaluated in Step 2B to determine if they are more than what is well-understood, routine, conventional activity in the field. The specification does not provide any indication that the computer system is anything other than a generic, general-purpose computer. MPEP 2106.05(d)(II), and the cases cited therein, including Intellectual Ventures I, LLC v. Symantec Corp., 838 F.3d 1307, 1321 (Fed. Cir. 2016), TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610 (Fed. Cir. 2016), and OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363 (Fed. Cir. 2015), indicate that mere collection or receipt of data over a network is a well-understood, routine, and conventional function when it is claimed in a merely generic manner (as it is here). Further, the Federal Circuit in Trading Techs. Int’l v. IBG LLC, 921 F.3d 1084, 1093 (Fed. Cir. 2019), and Intellectual Ventures I LLC v. Erie Indemnity Co., 850 F.3d 1315, 1331 (Fed. Cir. 2017), for example, indicated that the mere output of data is a well-understood, routine, and conventional function. Accordingly, a conclusion that the accessing and updating steps are well-understood, routine, conventional activity is supported under Berkheimer.
Thus, the claim is ineligible.
Dependent Claims
Dependent claims 9-14 and 16-20 do not recite any further limitations that cause the claim(s) to be patent eligible. Rather, the limitations of the dependent claims are directed toward additional aspects of the judicial exception and/or well-understood, routine and conventional additional elements that do not integrate the judicial exception into a practical application. Therefore, dependent claims 9-14 and 16-20 are not patent eligible under the same rationale as provided for in the rejection of independent claims 8 and 15.
Therefore, claims 8-20 are ineligible under 35 USC §101.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-3, 5-10, 12-17, and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Cella et al. (U.S. Patent Publication Number 2023/0058169) in view of Isele (U.S. Patent Publication Number 2020/0391738).
Regarding claim 1, Cella discloses a computer-implemented method comprising:
accessing one or more traffic scenes of a traffic model; (Cella ¶ 25 discloses “use of,” therefore accessing, traffic models including “a trigger-response mobile-element-following traffic model” and a “microscopic traffic model,” the traffic model including representation of “an aspect of an environment [i.e., a traffic scene].” The driving environment is modeled using traffic information, see ¶ 487, therefore, the environment includes a traffic scene.)
accessing preference data indicating a ranking, by one or more users, of a realism of the one or more traffic scenes; (Cella ¶ 208 discloses processing social data sources 22107, thereby accessing the data, and “predicting a high level of attendance [i.e., a realism of a traffic scene] by processing images on many social media feeds that indicate interest in the event by many people [i.e., preference data], prediction of traffic.” Cella ¶ 463 further discloses “a search result display ranking circuit 58210 that orders the search results based on a relevance” such that “in-vehicle results are ranked based on outcomes with respect to in-vehicle searches by other users,” see ¶ 453, wherein the simulations for the vehicle are ranked based on one or more metrics defined by the user, see ¶ 542. Also see ¶ 177 “the vehicle routing system 1492 accounts for the routing preference 14100 of the user 1490 when routing”)
One having ordinary skill in the art would recognize that a realism of a traffic scene, under its broadest reasonable interpretation, refers to the practicality of the traffic scene in reality; therefore, a predicted high level of attendance includes a high practicality for predicted traffic, thereby indicating a realism of the traffic scene, see Merriam-Webster “realism.”
Cella does not expressly disclose:
calculating, using a reward model, a reward value based, at least in part, on the preference data;
updating the traffic model using the reward value; and
moving an autonomous vehicle based on updating the traffic model.
Isele discloses:
calculating, using a reward model, a reward value based, at least in part, on the preference data; (Isele ¶ 36 discloses “the intention predictor 154 and model updater 156 [i.e., reward model] ... may be utilized to calculate probabilities and rewards associated with those various possible actions (i.e., of the identified traffic participant and of the autonomous vehicle) to determine the action or operating maneuver to be implemented by the autonomous action selector 158,” wherein the action or maneuver is selected based on “user preference,” see ¶ 53. The intention predictor with the model updater is a reward model in accordance with ¶ 80 of the instant specification defining the reward model as being used to compute a reward score.)
updating the traffic model using the reward value; and (Isele ¶ 63 discloses “The implementing 514 the maneuver may include acting, observing, and updating the probability models.”)
moving an autonomous vehicle based on updating the traffic model. (Isele ¶ 57 discloses “The autonomous action selector 158 may implement the maneuver based on the updated probability of the successful interaction between the identified traffic participant and the autonomous vehicle,” such that “traffic participants may be modelled according to an intelligent driver model (IDM),” see ¶ 58. See Fig. 5.)
It would have been obvious to a person having ordinary skill in the art before the effective filing date to have substituted the user preference data of Isele for the data indicating preference of Cella, with a reasonable expectation of success, because the substitution would result in calculating a reward based on the preference data of Cella and updating the traffic model of Cella using the reward value.
Further, it would have been obvious to a person having ordinary skill in the art before the effective filing date to have combined the remote control of the autonomous vehicle of Cella with moving the autonomous vehicle based on an updated traffic model, as disclosed in Isele, with a reasonable expectation of success, to ensure a safe planning strategy (Isele ¶ 52) and to have an updated probability of a successful interaction indicative of a likelihood of success (Isele ¶ 54), rendering the modification obvious.
Regarding claim 2, Cella in combination with Isele discloses the computer-implemented method of claim 1, wherein:
the preference data is generated based, at least in part, on one or more sets of one or more traffic scenarios, (Cella ¶ 208 discloses “predicting a high level of attendance by processing images on many social media feeds that indicate interest in the event by many people [i.e., preference data], prediction of traffic,” wherein parameters for the AI system include “traffic profiles 440 (location, direction, density, and patterns in time),” see ¶ 129.)
where the one or more traffic scenarios are ranked by one or more human labelers. (Cella ¶ 543 “The one or more model interpretability systems may also be used by a human user to improve and guide training of the machine learning model 65102, to help debug the machine learning model 65102, to help recognize bias in the machine learning model 65102,” wherein the model 65102 evaluates simulations and ranks the simulations based on one or more metrics.)
Regarding claim 3, Cella in combination with Isele discloses the computer-implemented method of claim 1, wherein:
the preference data comprises one or more pairs of traffic scenarios (Cella ¶ 156 discloses that “each evaluation in the series of evaluations uses feedback indicative of an effect on at least one of a vehicle operating state 945,” such that “a vehicle routing system 1692 to use the routing preference 16100 of the user 1690 and the corresponding effect on the at least one routing parameter to govern routing of the set of vehicles 1694.” The corresponding “effect” indicates a pair of traffic scenarios in accordance with the definition provided in ¶ 87 of the instant application which states “Scenario pairs 210 may be used to generate one or more feedback relationships 212.”)
where each pair of traffic scenarios in the one or more pairs of traffic scenarios comprises a first traffic scenario indicated as preferable over a second traffic scenario. (Cella ¶ 177 discloses “the vehicle routing system 1492 accounts for the routing preference 14100 of the user 1490 when routing the at least one vehicle 1410 within the set of vehicles 1494,” wherein the route includes a plurality of routes which includes at least a second scenario [i.e., preference of a first traffic scenario over a second]. Also see ¶ 156.)
Regarding claim 5, Cella in combination with Isele discloses the computer-implemented method of claim 1, wherein:
the traffic model comprises one or more neural networks. (Cella ¶ 504 discloses “the artificial intelligence system 60112 may train models, such as predictive models (e.g., various types of neural networks”). One having ordinary skill in the art would recognize that a neural network being an example of a model indicates that the model comprises a neural network.)
Regarding claim 6, Cella in combination with Isele discloses the computer-implemented method of claim 1, wherein:
the one or more traffic scenes of the traffic model comprise one or more traffic scenarios indicating a position, direction, and speed of one or more vehicles over an interval of time. (Cella ¶ 129 discloses “Parameters 430 may include parameters of various transportation-relevant profiles, such as traffic profiles 440 (location, direction, density and patterns in time,” such that the parameters are “taken as inputs by an ... AI system,” also see Fig. 4.)
Regarding claim 7, Cella in combination with Isele discloses the computer-implemented method of claim 1, wherein:
moving the autonomous vehicle comprises using the updated traffic model to determine one or more control inputs to the autonomous vehicle to cause the autonomous vehicle to navigate an environment. (Isele ¶ 57 discloses “The autonomous action selector 158 may implement the maneuver based on the updated probability of the successful interaction between the identified traffic participant and the autonomous vehicle,” such that “traffic participants may be modelled according to an intelligent driver model (IDM),” see ¶ 58. See Fig. 5. Also see ¶ 36 “model updater 156 ... may be utilized to calculate probabilities and rewards associated with those various possible actions (i.e., of the identified traffic participant and of the autonomous vehicle) to determine the action or operating maneuver to be implemented by the autonomous action selector 158.” Also see Fig. 4.)
It would have been obvious to a person having ordinary skill in the art before the effective filing date to have combined the remote control of the autonomous vehicle of Cella with moving the autonomous vehicle based on an updated traffic model, as disclosed in Isele, with reasonable expectation of success to ensure a safe planning strategy (Isele ¶ 52) and to have an updated probability of a successful interaction indicative of a likelihood of success (Isele ¶ 54), rendering the modification obvious.
Regarding claim 8, Cella discloses a non-transitory computer readable storage medium storing thereon executable instructions (Cella in at least ¶ 745) that, as a result of being executed by one or more processors of a computer system, cause the computer system to:
access one or more traffic scenes of a traffic model; (Cella ¶ 25 discloses “use of,” therefore accessing, traffic models including “a trigger-response mobile-element-following traffic model” and a “microscopic traffic model,” the traffic model including representation of “an aspect of an environment [i.e., a traffic scene].” The driving environment is modeled using traffic information, see ¶ 487, therefore, the environment includes a traffic scene.)
access preference data indicating a ranking, by one or more users, of a realism of the one or more traffic scenes; (Cella ¶ 208 discloses processing social data sources 22107, thereby accessing the data, and “predicting a high level of attendance [i.e., a realism of a traffic scene] by processing images on many social media feeds that indicate interest in the event by many people [i.e., preference data], prediction of traffic.” Cella ¶ 463 further discloses “a search result display ranking circuit 58210 that orders the search results based on a relevance” such that “in-vehicle results are ranked based on outcomes with respect to in-vehicle searches by other users,” see ¶ 453, wherein the simulations for the vehicle are ranked based on one or more metrics defined by the user, see ¶ 542. Also see ¶ 177 “the vehicle routing system 1492 accounts for the routing preference 14100 of the user 1490 when routing”)
Cella does not expressly disclose:
calculate, using a reward model, a reward value based, at least in part, on the preference data;
update the traffic model using the reward value; and
cause an autonomous machine to perform one or more actions based on the updated traffic model.
However, Isele discloses:
calculate, using a reward model, a reward value based, at least in part, on the preference data; (Isele ¶ 36 discloses “the intention predictor 154 and model updater 156 [i.e., reward model] ... may be utilized to calculate probabilities and rewards associated with those various possible actions (i.e., of the identified traffic participant and of the autonomous vehicle) to determine the action or operating maneuver to be implemented by the autonomous action selector 158,” wherein the action or maneuver is selected based on “user preference,” see ¶ 53. The intention predictor with the model updater is a reward model in accordance with ¶ 80 of the instant specification defining the reward model as being used to compute a reward score.)
update the traffic model using the reward value; and (Isele ¶ 63 discloses “The implementing 514 the maneuver may include acting, observing, and updating the probability models.”)
cause an autonomous machine to perform one or more actions based on the updated traffic model. (Isele ¶ 57 discloses “The autonomous action selector 158 may implement the maneuver based on the updated probability of the successful interaction between the identified traffic participant and the autonomous vehicle,” such that “traffic participants may be modelled according to an intelligent driver model (IDM),” see ¶ 58. See Fig. 5.)
It would have been obvious to a person having ordinary skill in the art before the effective filing date to have combined the traffic model of Cella with updating the traffic model using a reward value, as disclosed in Isele, with reasonable expectation of success to enable reasoning about which parts of the road are available and when (Isele ¶ 37) while finding the optimal action sequences for a greatest expected return (Isele ¶ 31), rendering the modification obvious.
Regarding claim 9, Cella in combination with Isele discloses the parallel limitations contained in parent claim 2 for the reasons discussed above. In addition, the combination of Cella and Isele discloses a non-transitory computer readable storage medium. (See Cella in at least ¶ 745)
Regarding claim 10, Cella in combination with Isele discloses the parallel limitations contained in parent claim 3 for the reasons discussed above. In addition, the combination of Cella and Isele discloses a non-transitory computer readable storage medium. (See Cella in at least ¶ 745)
Regarding claim 12, Cella in combination with Isele discloses the parallel limitations contained in parent claim 5 for the reasons discussed above. In addition, the combination of Cella and Isele discloses a non-transitory computer readable storage medium. (See Cella in at least ¶ 745)
Regarding claim 13, Cella in combination with Isele discloses the parallel limitations contained in parent claim 6 for the reasons discussed above. In addition, the combination of Cella and Isele discloses a non-transitory computer readable storage medium. (See Cella in at least ¶ 745)
Regarding claim 14, Cella in combination with Isele discloses the non-transitory computer readable storage medium of claim 8, wherein:
the computer system is to further cause an autonomous vehicle to navigate an environment based, at least in part, on updating the traffic model. (Isele ¶ 57 discloses “The autonomous action selector 158 may implement the maneuver based on the updated probability of the successful interaction between the identified traffic participant and the autonomous vehicle,” such that “traffic participants may be modelled according to an intelligent driver model (IDM),” see ¶ 58. See Fig. 5. Also see ¶ 36 “model updater 156 ... may be utilized to calculate probabilities and rewards associated with those various possible actions (i.e., of the identified traffic participant and of the autonomous vehicle) to determine the action or operating maneuver to be implemented by the autonomous action selector 158.” Also see Fig. 4.)
It would have been obvious to a person having ordinary skill in the art before the effective filing date to have combined the remote control of the autonomous vehicle of Cella with moving the autonomous vehicle based on an updated traffic model, as disclosed in Isele, with reasonable expectation of success to ensure a safe planning strategy (Isele ¶ 52) and to have an updated probability of a successful interaction indicative of a likelihood of success (Isele ¶ 54), rendering the modification obvious.
Regarding claim 15, Cella discloses a system comprising:
one or more processors to: access one or more traffic scenes of a traffic model; (Cella ¶ 25 discloses “use of,” therefore accessing, traffic models including “a trigger-response mobile-element-following traffic model” and a “microscopic traffic model,” the traffic model including representation of “an aspect of an environment [i.e., a traffic scene].” The driving environment is modeled using traffic information, see ¶ 487, therefore, the environment includes a traffic scene.)
access preference data indicating a ranking, by one or more users, of a realism of the one or more traffic scenes; (Cella ¶ 208 discloses processing social data sources 22107, thereby accessing the data, and “predicting a high level of attendance [i.e., a realism of a traffic scene] by processing images on many social media feeds that indicate interest in the event by many people [i.e., preference data], prediction of traffic.” Cella ¶ 463 further discloses “a search result display ranking circuit 58210 that orders the search results based on a relevance” such that “in-vehicle results are ranked based on outcomes with respect to in-vehicle searches by other users,” see ¶ 453, wherein the simulations for the vehicle are ranked based on one or more metrics defined by the user, see ¶ 542. Also see ¶ 177 “the vehicle routing system 1492 accounts for the routing preference 14100 of the user 1490 when routing”)
calculate, using a reward model, a reward value, using one or more neural network ... (Cella ¶ 126 discloses that the AI system, which includes “one or more neural networks,” may manage a set of rewards.)
Cella does not expressly disclose:
calculate, using a reward model, a reward value, using one or more neural network, based, at least in part, on the preference data;
update the traffic model using the reward value; and
cause an autonomous machine to perform one or more actions based on the updated traffic model.
However, Isele discloses:
calculate, using a reward model, a reward value, ... , based, at least in part, on the preference data; (Isele ¶ 36 discloses “the intention predictor 154 and model updater 156 ... [calculates] probabilities and rewards associated with those various possible actions (i.e., of the identified traffic participant and of the autonomous vehicle) to determine the action or operating maneuver to be implemented by the autonomous action selector 158,” wherein the action or maneuver is selected based on “user preference,” see ¶ 53, and “The model updater 156 may calculate an updated probability,” see ¶ 54.)
update the traffic model using the reward value; and (Isele ¶ 63 discloses “The implementing 514 the maneuver may include acting, observing, and updating the probability models.”)
cause an autonomous machine to perform one or more actions based on the updated traffic model. (Isele ¶ 57 discloses “The autonomous action selector 158 may implement the maneuver based on the updated probability of the successful interaction between the identified traffic participant and the autonomous vehicle,” such that “traffic participants may be modelled according to an intelligent driver model (IDM),” see ¶ 58. See Fig. 5.)
It would have been obvious to a person having ordinary skill in the art before the effective filing date to have utilized the neural network of Cella to perform the reward value calculation of Isele, because the substitution would predictably result in calculating a reward value using one or more neural networks based, at least in part, on the preference data, and updating the traffic model of Cella using the reward value.
Further, it would have been obvious to a person having ordinary skill in the art before the effective filing date to have combined the AI system of Cella with updating the traffic model, as disclosed in Isele, with reasonable expectation of success to ensure a safe planning strategy (Isele ¶ 52) and to have an updated probability of a successful interaction indicative of a likelihood of success (Isele ¶ 54), rendering the modification obvious.
Regarding claim 16, Cella in combination with Isele discloses the parallel limitations contained in parent claim 2 for the reasons discussed above.
Regarding claim 17, Cella in combination with Isele discloses the parallel limitations contained in parent claim 3 for the reasons discussed above.
Regarding claim 19, Cella in combination with Isele discloses the system of claim 15, wherein:
the one or more processors are further to cause an autonomous machine to navigate an environment based, at least in part, on using the updated traffic model to determine one or more control inputs to the autonomous machine. (Isele ¶ 57 discloses “The autonomous action selector 158 may implement the maneuver based on the updated probability of the successful interaction between the identified traffic participant and the autonomous vehicle,” such that “traffic participants may be modelled according to an intelligent driver model (IDM),” see ¶ 58. See Fig. 5. Also see ¶ 36 “model updater 156 ... may be utilized to calculate probabilities and rewards associated with those various possible actions (i.e., of the identified traffic participant and of the autonomous vehicle) to determine the action or operating maneuver to be implemented [i.e., determine one or more control inputs] by the autonomous action selector 158 [i.e., autonomous machine].” Also see Fig. 4.)
It would have been obvious to a person having ordinary skill in the art before the effective filing date to have combined the AI system of Cella with updating the traffic model, as disclosed in Isele, with reasonable expectation of success to ensure a safe planning strategy (Isele ¶ 52) and to have an updated probability of a successful interaction indicative of a likelihood of success (Isele ¶ 54), rendering the modification obvious.
Regarding claim 20, Cella in combination with Isele discloses the system of claim 15, wherein the system is comprised in at least one of:
a control system for an autonomous or semi-autonomous machine; a perception system for an autonomous or semi-autonomous machine; a first system for performing simulation operations; a second system for performing deep learning operations; a third system implemented using an edge device; a fourth system implemented using a robot; a fifth system incorporating one or more virtual machines (VMs); a sixth system implemented at least partially in a data center; a seventh system for performing digital twin operations; an eighth system for performing light transport simulation; a ninth system for performing collaborative content creation for 3D assets; a tenth system for performing conversational Artificial Intelligence operations; an eleventh system for generating synthetic data; a twelfth system for implementing a web-hosted service for detecting program workload inefficiencies; an application as an application programming interface ("API"); a thirteenth system implemented at least partially using cloud computing resources; a fourteenth system for presenting one or more of virtual reality content, augmented reality content, or mixed reality content; or a fifteenth system implementing one or more large language models (LLMs). (Cella ¶ 153 discloses “the vehicle comprises an artificial intelligence system 1036, the method further comprising automating at least one control parameter of the vehicle by the artificial intelligence system 1036. In embodiments, the vehicle 1010 is at least a semi-autonomous vehicle”)
Claims 4, 11, and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Cella et al. (U.S. Patent Publication Number 2023/0058169) in view of Isele (U.S. Patent Publication Number 2020/0391738), further in view of Qi et al. (U.S. Patent Publication Number 2021/0081787).
Regarding claim 4, Cella in combination with Isele discloses the computer-implemented method of claim 1, wherein:
the reward value is calculated by comparing two or more traffic scenarios associated with the preference data ... (Cella ¶ 430 “a preferred outcome of maximum safety. In such a case, the interface 56133 may provide a reward parameter to a model or expert system 5657”)
Cella in combination with Isele does not expressly disclose:
[a reward value] to determine an average loss value over a sequence of the two or more traffic scenarios.
However, Qi discloses:
[a reward value] to determine an average loss value over a sequence of the two or more traffic scenarios. (Qi ¶ 19 discloses “the loss function is a function for calculating a sum of a first t