DETAILED ACTION
Status of Claims
Claims 1-4, 7-9, 11-14, and 17-19 are currently pending and have been examined in this application. This action is NON-FINAL.
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 10/31/2025 has been entered.
Response to Arguments
Applicant's arguments filed 10/31/2025 have been fully considered but they are not persuasive.
Applicant argues:
(a) Regarding the 35 USC 103 rejection, “For the sake of argument and completeness, Lockwood's drive line 146 is revised by vehicle 102 based on its selected revised trajectory (which was generated by vehicle 102 based on the altered size of the overlaid shape 140). However, Lockwood's drive line 146 is not revised based on a teleoperation input as defined in claim 1, i.e., a teleoperation input formed by combining and synthesizing inputs entered into the teleoperation system by a teleoperator mimicking direct control of a vehicle by a driver, wherein the inputs comprise a steering input to alter a steering angle of the vehicle, a throttle input to change a throttle position for the vehicle, and/or a brake input to change a brake position for the vehicle.” (Remarks, pg. 8)
(b) Regarding the 35 USC 103 rejection, “On page 8 of the Office Action, the Examiner states that the combination of Lockwood and Pierfelice do not teach the teleoperation input of claim 1. Akbarzadeh was not cited in this regard. However, Gogna was cited as curing this deficiency of the Lockwood/Pierfelice combination. The Examiner cites paragraph [0022] of Gogna as teaching that a teleoperator may provide an input to the user interface via a steering wheel, a brake pedal, and/or an acceleration pedal. However, Applicant asserts that there is no motivation to modify Lockwood's solution by replacing its teleoperation input for expanding the size of the overlaid shape 140 to comprise Gogna's teleoperation input since such modification would unnecessarily add complexity and computational intensity to (as well as require an extensive redesign of) Lockwood's solution. Applicant asserts that a conclusion otherwise uses impermissible hindsight reasoning.” (Remarks, pg. 8)
Examiner respectfully disagrees.
Regarding point (a), Gogna teaches gathering inputs from a teleoperator that mimic direct control of the vehicle (Gogna, para. [0022] “In some examples, the controls may comprise a steering control, a braking control, and/or an accelerations control, and one or more of these controls may be configured to respond to real-time changes in the environment of the vehicle. For example, a teleoperator may provide an input to the user interface via one or more controls (e.g., a steering wheel, a brake pedal, an acceleration pedal, a hand-held controller, or the like).”), and further teaches a planner that generates trajectories for the vehicle based on the teleoperator inputs, which in turn determine the controls used for the vehicle (Gogna, para. [0090] “For instance, the guidance component 544 can be configured to generate and/or determine a potential trajectory (e.g., reference trajectory 306 in FIG. 3) for the vehicle 502 based at least in part on one or more inputs received from the user interface 540. In various examples, the instruction may comprise an indication of a velocity and a direction usable by the planning component 532 to generate one or more predicted trajectories for the vehicle 502 (e.g., direction of travel, speed, etc.).”). In other words, Gogna teaches combining the teleoperator steering, braking, and acceleration inputs from controls that mimic direct control of a vehicle (a steering wheel, brake pedal, and acceleration pedal) in order to generate a trajectory for the vehicle and to determine commands for the movement of the vehicle based on that trajectory. Gogna therefore teaches a teleoperation input formed by combining and synthesizing inputs entered into the teleoperation system by a teleoperator mimicking direct control of a vehicle by a driver, because Gogna teaches using input from a remote steering wheel, brake pedal, and acceleration pedal to determine a potential trajectory used to control the vehicle.
Regarding point (b), Lockwood and Gogna are both in the same field of using inputs from a teleoperator to alter the trajectory of an autonomous vehicle. Altering the mode of input from selection on a computer to controls that mimic driving controls would not change Lockwood’s function of using the teleoperator input to control the autonomous vehicle. It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to modify the combination of Lockwood and Pierfelice with Gogna to include a steering wheel, brake pedal, and acceleration pedal as teleoperation controls in order to provide more realistic controls and receive inputs that better capture the intent of the human interacting with the controls (Gogna, para. [0022]).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-4, 7-9, 11-14, and 17-19 are rejected under 35 U.S.C. 103 as being unpatentable over Lockwood et al. (US 20190011912 A1), in view of Pierfelice (US 20140005925 A1), in further view of Gogna (US 20210323573 A1), and in further view of Akbarzadeh et al. (US 20230296758 A1).
Regarding claim 1,
Lockwood teaches:
A method of controlling an autonomous vehicle, comprising:
receiving, at the autonomous vehicle, a teleoperation input from a teleoperation system through a communication link, wherein the teleoperation input
(Lockwood – [0066] “As shown in FIG. 2, the example vehicle systems 202 also include a network interface 234 configured to provide a communication link between the vehicle 102 and the teleoperations system 148. For example, the network interface 234 may be configured to allow data to be exchanged between the vehicle 102, other devices coupled to a network, such as other computer systems, other vehicles 102 in the fleet of vehicles, and/or with the teleoperations system 148.” [0110] “In some examples, the teleoperations system 148 may send teleoperations signals via the teleoperations transmitter 306, to the vehicles 102A-102C to provide guidance to the respective vehicle controllers 228 of the vehicles 102A-102C to switch from the first operating mode to the second operating mode while operating in the respective second geographic areas. In some examples, the second operating parameters may include one or more of altered performance parameters (e.g., speed, acceleration, braking rates, and steering input rates), altered vehicle operation policies (e.g., safety-related guidelines for controlling the vehicle), altered vehicle operation laws, or vehicle operation regulations.”)
using a processor of the autonomous vehicle:
(Lockwood – [0049] “In various implementations, the architecture 200 may be implemented using a uniprocessor system including one processor, or a multiprocessor system including several processors (e.g., two, four, eight, or another suitable number). The processor(s) may be any suitable processor capable of executing instructions.”)
generating an updated map based on (i) the received teleoperation input from the teleoperation system, (ii) a current position and a current orientation of the autonomous vehicle, and (iii) an existing trajectory of the autonomous vehicle,
(Lockwood – [0070] “In some examples, the sensor data may include raw sensor data or processed sensor data, and the road network data may include data related to a global or local map of an area associated with operation of the vehicle 102. In some examples, the communication signals may include data associated with the current status of the vehicle 102 and its systems, such as, for example, its current position, current speed, current path and/or trajectory, current occupancy, the level of charge of one or more of its batteries, and/or the operational status of its sensors and drive systems.” [0083] “The teleoperator 150 may use the active view zone 522 to monitor the operation of, and the teleoperator's interaction with, the selected vehicle 102 (i.e., AV 001 in this example) before, during, and/or after the teleoperator 150 interacts with the vehicle 102 via the teleoperator interface 154. For example, the vehicle 102 may send communications signals to the teleoperator system 148 including sensor signals from one or more sensors associated with the vehicle 102 and/or a request for guidance and/or information from the teleoperations system 148. Based at least in part on the communications signals, the active view zone 522 provides a real-time perspective view of the vehicle 102 and the relevant environment. In some examples, the active view zone 522 may display any permutation of sensor data, operation state data, and/or teleoperations data.”
Examiner note: The data displayed in the active view zone 522, as shown in Fig. 5B, corresponds to the updated map data of an environment.)
wherein the updated map provides a new path for the autonomous vehicle to traverse for avoiding an event or road condition;
(Lockwood – [0094] “In some examples, the teleoperations interface 154 may be configured to permit the teleoperator 150 to advise the teleoperations system 148 and/or other vehicles 102 of the fleet 302 about the object 524 in the road 106. For example, the teleoperations interface 154 may facilitate identification of the location and information associated with the object 524 (e.g., its classification and/or whether it is static or dynamic) for use by the teleoperations system 148 and/or other vehicles 102 of the fleet 302. This information may result in vehicles 102 avoiding the area associated with the object 524 or may provide guidance for vehicles 102 that encounter the object 524 and/or teleoperators assisting vehicles 102 as they encounter the object 524.”)
providing the updated map to a planner of the autonomous vehicle;
(Lockwood – [0056] “In some examples, the planner 214 may be configured to generate data representative of a trajectory of the vehicle 102, for example, using data representing a location of the vehicle 102 in the environment 100 and other data, such as local pose data, that may be included in the location data 212. In some examples, the planner 214 may also be configured to determine projected trajectories predicted to be executed by the vehicle 102. The planner 214 may, in some examples, be configured to calculate data associated with a predicted motion of an object in the environment 100, and may determine a predicted object path associated with the predicted motion of the object. In some examples, the object path may include the predicted object path.”)
determining, by the planner, a modified trajectory or a new trajectory for the autonomous vehicle based at least in part on the updated map of the environment comprising a speed limit determined
(Lockwood – [0035] “The second operating parameters may include one or more of reducing energy expenditure of the driverless vehicles, setting a maximum operating speed, preventing the driverless vehicles from operating bidirectionally, changing a threshold confidence level required for autonomous operation, changing a threshold confidence level required for autonomous operation in a second geographic area, altering at least one of an object classification model or an object prediction model used by the driverless vehicles, or relaxing vehicle operation policies associated with complying with traffic laws and regulations.” [0045] “In some examples, the virtual boundaries 140 of the driving corridor 138 may be determined based at least in part on sensor data received from sensors associated with the vehicle 102 and/or road network data received by the vehicle 102 via a road network data store, as explained in more detail herein. Though not illustrated in FIG. 1, such sensor data indicative of objects may be represented in such a corridor as indented or removed portions.” [0092] “In the example shown, the vehicle 102 may expand the boundaries 140 of its driving corridor 138 in a manner consistent with the teleoperations signals, for example, as shown in FIG. 5B. Upon expansion of the driving corridor 138, the vehicle 102 may generate, for example, via the vehicle systems 202 (FIG. 2), a plurality of revised trajectories (e.g., concurrently or substantially simultaneously within technical capabilities) based at least in part on the altered boundaries 140 of the driving corridor 138. In the example shown, the alert bar 536 displays a revised confidence level (“System Confidence 95%”) that is above the threshold confidence level. In some examples, the vehicle 102 may calculate a confidence level for each of the revised trajectories, and the vehicle 102 may select a revised trajectory having the highest confidence level from among the plurality of revised trajectories.”)
controlling the autonomous vehicle according to the modified trajectory or the new trajectory.
(Lockwood – [0092] “Based at least in part on the selected revised trajectory, the vehicle 102 may determine a revised drive line 146 for use in maneuvering around the object 524. Thereafter, the vehicle controller 228 (FIG. 2) may be configured to operate the vehicle 102 according to the revised drive line 146, for example, as shown in FIG. 5B, and maneuver around the object 524.”)
Lockwood does not explicitly teach the following limitation, however, Pierfelice teaches:
the
(Pierfelice – [0032] “The known speed limits can be associated with the map data and stored in one of the memory devices. In alternative embodiments, the average speed limit can be determined by averaging the current speed of other vehicles traveling along the route 122, which can be detected by speed sensing systems and provided in real time via the network interface hardware 118 (FIG. 1).”)
Lockwood and Pierfelice are both considered to be analogous to the claimed invention because they are both in the same field of providing route guidance to a vehicle. It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to modify Lockwood to include calculating a speed limit based on a throttle input as taught by Pierfelice in order to determine an optimal route based on factors such as traffic, road conditions, distance, and speed limits (Pierfelice para. [0030]).
The combination of Lockwood and Pierfelice does not explicitly teach the following limitation, however, Gogna teaches:
wherein the teleoperation input is formed by combining and synthesizing inputs entered into the teleoperation system by a teleoperator mimicking direct control of a vehicle by a driver, and wherein the inputs comprise a steering input to alter a steering angle of the vehicle, a throttle input to change a throttle position for the vehicle, and/or a brake input to change a brake position for the vehicle; and
(Gogna – [0022] “In various examples, a user interface of a computing device provides controls to maneuver a representation of the vehicle in a simulated environment displayed in the user interface. In some examples, the controls may comprise a steering control, a braking control, and/or an accelerations control, and one or more of these controls may be configured to respond to real-time changes in the environment of the vehicle. For example, a teleoperator may provide an input to the user interface via one or more controls (e.g., a steering wheel, a brake pedal, an acceleration pedal, a hand-held controller, or the like). By providing more realistic controls, the user interface may receive input(s) that better capture an intent of a human interacting with the user interface, and represent that intent in the instruction sent to the vehicle. A vehicle computing device of the vehicle may, in some examples, perform planning considerations based on the input(s) captured using the aforementioned controls.” [0090] “For instance, the guidance component 544 can be configured to generate and/or determine a potential trajectory (e.g., reference trajectory 306 in FIG. 3) for the vehicle 502 based at least in part on one or more inputs received from the user interface 540. In various examples, the instruction may comprise an indication of a velocity and a direction usable by the planning component 532 to generate one or more predicted trajectories for the vehicle 502 (e.g., direction of travel, speed, etc.).”)
Gogna is considered to be analogous to the claimed invention because it is in the same field of controlling the teleoperations of a vehicle. It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to modify the combination of Lockwood and Pierfelice with Gogna to include a steering wheel, brake pedal, and acceleration pedal as teleoperation controls in order to provide more realistic controls and receive inputs that better capture the intent of the human interacting with the controls (Gogna para. [0022]).
The combination of Lockwood, Pierfelice, and Gogna does not explicitly teach the following, however, Akbarzadeh teaches:
filtering sensor data, from one or more sensors of the autonomous vehicle, to remove one or more objects that match a tracked high precision object;
(Akbarzadeh – [0057] “The dynamic object filter 112 may be configured to identify and remove one or more portions of the RADAR data 102 that may correspond to dynamic objects. For example, the RADAR data 102 may include indications based on any suitable technique as to whether detected objects were moving at the time that a corresponding RADAR scan was performed. The dynamic object filter 112 may be configured to identify such objects as being dynamic objects and may be configured to remove points from the RADAR data 102 that correspond to the dynamic objects.” [0058] “In these or other embodiments, the dynamic object filter 112 may be configured to perform object tracking between multiple scan data sets 108 and may be configured to identify dynamic objects based on the object tracking…In some embodiments, the dynamic object filter 112 may be configured to use spatial transformations between scan data sets 108 to determine whether objects in multiple scan data sets 108 have moved between the scan data sets 108. The dynamic object filter 112 may be configured to remove points from the RADAR data that correspond to objects determined to have moved between scans.”)
determining, by the planner, a modified trajectory or a new trajectory for the autonomous vehicle… by incorporating the filtered sensor data and environmental data associated with a physical environment of the autonomous vehicle; and
(Akbarzadeh – [0231] “At block B1310, RADAR map data may be generated based on the decompressed RADAR data packet and one or more other RADAR data packets. The RADAR map data generation may be performed according to any applicable description described in the present disclosure.” [0239] “One or more of the controller(s) 1436 may receive inputs (e.g., represented by input data) from an instrument cluster 1432 of the vehicle 1400 and provide outputs (e.g., represented by output data, display data, etc.) via a human-machine interface (HMI) display 1434, an audible annunciator, a loudspeaker, and/or via other components of the vehicle 1400. The outputs may include information such as vehicle velocity, speed, time, map data (e.g., the HD map 1422 of FIG. 14C), location data (e.g., the location of the vehicle 1400, such as on a map), direction, location of other vehicles (e.g., an occupancy grid), information about objects and status of objects as perceived by the controller(s) 1436, etc. For example, the HMI display 1434 may display information about the presence of one or more objects (e.g., a street sign, caution sign, traffic light changing, etc.), and/or information about driving maneuvers the vehicle has made, is making, or will make (e.g., changing lanes now, taking exit 34B in two miles, etc.).”
Examiner note: where the filtered RADAR data is used to generate map data which is further used to determine and display information about driving maneuvers the vehicle will make, which corresponds to determining a new trajectory by incorporating filtered sensor data and environmental data.)
Akbarzadeh is considered to be analogous to the claimed invention because it is in the same field of monitoring the surroundings of an autonomous vehicle. It would have been obvious to one skilled in the art before the effective filing date of the claimed invention to modify the combination of Lockwood, Pierfelice, and Gogna with Akbarzadeh to include filtering the sensor data to remove obstacles identified as dynamic in order to improve the accuracy of the maps generated (Akbarzadeh, para. [0205]).
Regarding claim 2,
The combination of Lockwood, Pierfelice, Gogna, and Akbarzadeh teaches the limitations of claim 1.
Lockwood further teaches:
further comprising determining, by the teleoperation system or by the autonomous vehicle, an event or a condition associated with at least a portion of the existing trajectory.
(Lockwood – [0030] “For example, an event may include one or more of an activity associated with a portion of the path, an object along the path at least partially within the driving corridor as the vehicle approaches the object (e.g., people, animals, vehicles, or other static or dynamic objects) along the path at least partially within the driving corridor or moving with a trajectory toward the driving corridor as the vehicle approaches the object.” [0089] “FIG. 5A shows an example vehicle 102 in a first example event scenario in which the example static object 524 is in the road 106. In some examples, as the vehicle 102 approaches the object 524, the sensors 204 (FIG. 2) associated with the vehicle 102 may detect the object 524. Once detected, one or more of the planner 214, the object data calculator 216, the object classifier 218, the collision predictor system 220, and the kinematics calculator 222 (FIG. 2) may be used to determine the location of the object 524, classify the object 524, determine whether the object 524 is static or dynamic, and if the object is dynamic, predict a possible trajectory of the object 524.”)
Regarding claim 3,
The combination of Lockwood, Pierfelice, Gogna, and Akbarzadeh teaches the limitations of claim 2.
Lockwood further teaches:
wherein the teleoperation input is entered responsive to the event or condition associated with the at least the portion of the existing trajectory.
(Lockwood – [0026] “In some examples, the teleoperations system may provide guidance and information to a driverless vehicle when the driverless vehicle encounters an event, so that the driverless vehicle will be able to avoid, maneuver around, and/or pass through the area associated with the event. The driverless vehicle may be configured to send communication signals to the remotely located teleoperations system, and based at least in part on the communication signals, the teleoperations system may provide the driverless vehicle with guidance, including instructions, proposed actions or maneuvers for the evaluation and/or execution by the driverless vehicle, and/or information to assist the driverless vehicle past the area associated with the event.”)
Regarding claim 4,
The combination of Lockwood, Pierfelice, Gogna, and Akbarzadeh teaches the limitations of claim 2.
Lockwood further teaches:
wherein the event or condition comprises a road construction, a weather condition, a stop sign, or a school zone.
(Lockwood – [0111] “Although the example events described with respect to FIG. 10 include accident, school, and construction zones, other geographic location-related zones are contemplated. For example, other events may be associated with flood zones, parade zones, special event zones, and/or zones associated with slow traffic, such as areas where vehicles are being driven into bright sunlight or areas where weather conditions such as rain or snow are affecting traffic rates.”)
Regarding claim 7,
The combination of Lockwood, Pierfelice, Gogna, and Akbarzadeh teaches the limitations of claim 1.
Lockwood further teaches:
wherein the teleoperation input is based on real time information of the autonomous vehicle.
(Lockwood – [0083] “For example, as shown in FIG. 5A, the view selector icon 518B has been selected, and the UI 500A includes an active view zone 522 providing a real-time simulated (or animated) perspective view of the vehicle 102 selected via the selector 504A. In the example shown, the active view zone 522 shows an animation depicting the vehicle 102 encountering an object 524 in the road 106. The teleoperator 150 may use the active view zone 522 to monitor the operation of, and the teleoperator's interaction with, the selected vehicle 102 (i.e., AV 001 in this example) before, during, and/or after the teleoperator 150 interacts with the vehicle 102 via the teleoperator interface 154. For example, the vehicle 102 may send communications signals to the teleoperator system 148 including sensor signals from one or more sensors associated with the vehicle 102 and/or a request for guidance and/or information from the teleoperations system 148. Based at least in part on the communications signals, the active view zone 522 provides a real-time perspective view of the vehicle 102 and the relevant environment. In some examples, the active view zone 522 may display any permutation of sensor data, operation state data, and/or teleoperations data.”)
Regarding claim 8,
The combination of Lockwood, Pierfelice, Gogna, and Akbarzadeh teaches the limitations of claim 7.
Lockwood further teaches:
further comprising presenting visualization data that comprises the real time information of the autonomous vehicle on a display to enable a teleoperator to determine whether to intervene in operation of a planner of the autonomous vehicle.
(Lockwood – [0085] “The example UI 500A shown in FIG. 5A also includes a video view zone 528. In some examples, the video view zone 528 may provide a real-time video view from a video camera associated with the vehicle 102. In some examples, any data discussed herein as being “real-time” may additionally or alternatively include real-time data and/or historical data. This may assist the teleoperator 150 with quickly understanding the situation encountered by the vehicle 102.” [0089] “Once detected, one or more of the planner 214, the object data calculator 216, the object classifier 218, the collision predictor system 220, and the kinematics calculator 222 (FIG. 2) may be used to determine the location of the object 524, classify the object 524, determine whether the object 524 is static or dynamic, and if the object is dynamic, predict a possible trajectory of the object 524. As the vehicle 102 approaches the object 524, one or more of these systems may be used to calculate a confidence level associated with a probability that the vehicle 102 will be able to successfully maneuver past the object 524, for example, without assistance from the teleoperations system 148. As the confidence level drops below a threshold minimum confidence level, the vehicle 102 may slow its speed or stop, and use its network interface 234 (FIG. 2) to send communication signals to the teleoperations system 148 providing sensor data and a request for guidance from the teleoperations system 148. In some examples, the request may be inferred and/or determined by the teleoperations system 148 based at least in part on, for example, the sensor data and or other information associated with the vehicle 102, such as its change in speed, confidence level, and/or other maneuvering that might be indicative of a need for guidance from the teleoperations systems 148.”)
Regarding claim 9,
The combination of Lockwood, Pierfelice, Gogna, and Akbarzadeh teaches the limitations of claim 1.
Lockwood further teaches:
wherein the teleoperation input further comprises a modification to classification of an object or obstacle in an environment of the autonomous vehicle or guidance for causing the autonomous vehicle to ignore or avoid the object or obstacle.
(Lockwood – [0032] “In some examples, the teleoperations signals may provide guidance including one or more of causing the driverless vehicle to at least one of ignore the event, increase or decrease probabilities of classes of objects (e.g., in a school zone, increase a probability that a small object is a child), alter virtual boundaries of a driving corridor within which the vehicle operates, and operate the driverless vehicle according to a travel speed constraint (e.g., reducing a maximum travel speed).”)
Regarding claim 11,
Claim 11 recites a system comprising substantially the same limitation as claim 1 above, therefore it is rejected for the same reasons. In addition, Lockwood teaches:
A system for controlling an autonomous vehicle, comprising:
a teleoperation receiver, configured to receive, through a communication link, a teleoperation input from a teleoperation system,
(Lockwood – [0070] “The example fleet 302 includes a plurality of vehicles 102, at least some which are communicatively coupled to the teleoperations system 148, for example, via the respective network interfaces 234 of the vehicles 102, and a teleoperations receiver 304 and a teleoperations transmitter 306 associated with the teleoperations system 148. For example, a vehicle 102 may send communication signals via the network interface 234, which are received by the teleoperations receiver 304.”)
Regarding claim 12,
Claim 12 recites a system comprising substantially the same limitation as claim 2 above, therefore it is rejected for the same reasons.
Regarding claim 13,
Claim 13 recites a system comprising substantially the same limitation as claim 3 above, therefore it is rejected for the same reasons.
Regarding claim 14,
Claim 14 recites a system comprising substantially the same limitation as claim 4 above, therefore it is rejected for the same reasons.
Regarding claim 17,
Claim 17 recites a system comprising substantially the same limitation as claim 7 above, therefore it is rejected for the same reasons.
Regarding claim 18,
Claim 18 recites a system comprising substantially the same limitation as claim 8 above, therefore it is rejected for the same reasons.
Regarding claim 19,
Claim 19 recites a system comprising substantially the same limitation as claim 9 above, therefore it is rejected for the same reasons.
Conclusion
The prior art made of record and not relied upon, which is considered pertinent to applicant's disclosure or directed to the state of the art, is listed on the enclosed PTO-892.
The following is a brief description of relevant prior art that was cited but not applied:
Kazemi et al. (US 20190066506 A1) discloses synthesizing across multiple sources of operational information to extract a singular vehicle intention when multiple different control systems exist which have overlapping control responsibilities and priorities.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MELANIE HUBER whose telephone number is (703)756-1765. The examiner can normally be reached M-F 7:30am-4pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, JAMES LEE, can be reached at (571) 270-5965. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/M.G.H./Examiner, Art Unit 3668 /JAMES J LEE/Supervisory Patent Examiner, Art Unit 3668