Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
This Office Action is in response to the applicant’s amendments and remarks filed on 11/25/2025.
Claims 1-10 are pending for examination.
Regarding the rejection of claims 1-10 under 35 U.S.C. §103, applicant’s arguments have been considered but are deemed moot in view of the new grounds of rejection necessitated by applicant’s amendment, outlined below.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-10 are rejected under 35 U.S.C. 103 as being unpatentable over Rastoll et al. (US 20200192351 A1) in view of Van Beek et al. (US 20220161815 A1) and Rastoll et al. (US 20200192352 A1), hereinafter referred to as Rastoll 1, Van Beek, and Rastoll 2, respectively.
Rastoll 1 and Van Beek were first cited in a previous Office Action. Rastoll 2 was first cited in a previous Office Action as prior art made of record but not relied upon.
Regarding claim 1, Rastoll 1 discloses:
A method for control of a vehicle by an operator, comprising the following steps:
(Rastoll 1, FIG. 3A; ¶[0021], ¶[0023], ¶[0026]: remote operator controls vehicle)
using a predictive map to control the vehicle by:
(Rastoll 1, FIG. 4A; FIG. 7; ¶[0027], ¶[0028], ¶[0045]: NAV map)
detecting a situation and/or location reference of the vehicle;
(Rastoll 1, FIG. 2; FIG. 3A; ¶[0045], ¶[0046]: NAV map, sensors detect impediment)
transmitting data of a defined set of sensors;
(Rastoll 1, ¶[0051]: providing information from vehicle sensors; ¶[0044]: remote operator may use additional sensors; Where vehicle sensors provide information to the remote operator and where the operator can select additional sensors, indicating that the original set of sensors is a defined set)
fusing and processing the data of the defined set of sensors;
(Rastoll 1, ¶[0028], ¶[0071], ¶[0072]: sensor fusion maps; Where the sensor data is processed into sensor fusion maps)
displaying the fused and processed data for the operator; and
(Rastoll 1, FIG. 3A; FIG. 6; ¶[0015]; ¶[0050], ¶[0051]: vehicle sensors provide description of scene to remote operator)
creating/updating the predictive map by:
(Rastoll 1, FIG. 3A; FIG. 4A; FIG. 7; ¶[0052]: update NAV map)
recognizing a problematic situation and/or a problematic location by: observation of the operator and/or marking by the operator;
(Rastoll 1, FIG. 2; FIG. 3A; FIG. 4A; ¶[0051]: impediment annotated by remote operator)
storing the problematic situation and/or the problematic location in a first database for storing problematic situations and locations; and
(Rastoll 1, FIG. 4A; FIG. 7; FIG. 8; ¶[0052]-¶[0053], ¶[0055], ¶[0057]-¶[0060], ¶[0116]: impediment annotations, location, etc. saved locally and transmitted to database to update NAV maps).
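For purposes of illustration only, the following is a minimal sketch of a first database of operator-marked problematic situations and locations of the kind described above. It is not drawn from Rastoll 1 or from the claimed invention; the schema, field names, and example entry are hypothetical.

```python
# Hypothetical sketch only: a simple store of operator-marked problematic
# situations/locations; the schema, field names, and example entry are
# illustrative and are not taken from Rastoll 1 or the claimed invention.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE problematic_locations (
        id INTEGER PRIMARY KEY,
        lat REAL,
        lon REAL,
        description TEXT,           -- operator annotation of the impediment
        marked_by_operator INTEGER  -- 1 if explicitly marked, 0 if inferred from observation
    )
""")
conn.execute(
    "INSERT INTO problematic_locations (lat, lon, description, marked_by_operator) "
    "VALUES (?, ?, ?, ?)",
    (48.137, 11.575, "construction zone blocking right lane", 1),
)
conn.commit()

# A later predictive-map update could query stored entries near an upcoming route.
for row in conn.execute("SELECT lat, lon, description FROM problematic_locations"):
    print(row)
```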
Rastoll 1 is silent on the following limitations, bolded for emphasis. However, in the same field of endeavor, Van Beek teaches:
training a model for selecting the defined set of sensors and fusing the data of the defined set of sensors by machine learning,
(Van Beek, FIG. 58A; FIG. 60; FIG. 61; FIG. 62B; ¶[0417]-¶[0422]: training machine learning model to adjust weight given to set of sensors based on vehicle context; ¶[0432]-¶[0437]: sensor weights may be zero, i.e. not used; ¶[0115]-¶[0116]: supervised learning, similar to using human-annotated training data; Where a machine learning algorithm trains a model to adjust vehicle sensor weights based on context and where the model is trained using supervised learning)
wherein the fusion of sensor data includes dynamically adjusting a weighting of individual sensors based on at least one of the location, situation, or predictive map data,
(Van Beek, FIG. 58A; FIGs. 60-62B; ¶[0065]-¶[0068]; ¶[0149]; ¶[0417]-¶[0422]: training machine learning model to adjust weight given to set of sensors based on vehicle context; Where each sensor is weighted based on the context, i.e. the situation; for example the LIDAR sensor is weighted more heavily than the camera sensor at night).
It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Rastoll 1 with the features taught by Van Beek because “…non-uniform data sampling may significantly reduce the requirement of resources and the cost of the overall processing pipeline. Instead of sampling data from every sensor at a set interval (e.g., every second), the sampling of one or more sensors may be customized based on context” (Van Beek, ¶[0415]). See also ¶[0416] and ¶[0421].
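For purposes of illustration only, the following is a minimal sketch of context-dependent sensor weighting of the kind characterized above (weights adjusted per context, with a zero weight excluding a sensor). It is not drawn from Van Beek; the sensor names, contexts, and weight values are hypothetical.

```python
# Hypothetical sketch only: context-dependent sensor weighting; sensor names,
# contexts, and weight values are illustrative and are not taken from Van Beek.
from typing import Dict

# A context dictionary mapping a driving context to per-sensor fusion weights.
# A weight of 0.0 means the sensor is effectively excluded from the defined set.
SENSOR_WEIGHTS: Dict[str, Dict[str, float]] = {
    "day_clear": {"camera": 0.6, "lidar": 0.3, "radar": 0.1},
    "night":     {"camera": 0.1, "lidar": 0.6, "radar": 0.3},
    "heavy_fog": {"camera": 0.0, "lidar": 0.4, "radar": 0.6},
}

def fuse_confidences(context: str, confidences: Dict[str, float]) -> float:
    """Fuse per-sensor detection confidences using context-dependent weights."""
    weights = SENSOR_WEIGHTS[context]
    active = {s: w for s, w in weights.items() if w > 0.0}  # the "defined set" of sensors
    total = sum(active.values())
    return sum(w * confidences.get(s, 0.0) for s, w in active.items()) / total

# Example: at night the LIDAR estimate dominates the fused result.
print(fuse_confidences("night", {"camera": 0.2, "lidar": 0.9, "radar": 0.7}))
```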
Rastoll 1 and Van Beek are silent on the following limitations, bolded for emphasis. However, in the same field of endeavor, Rastoll 2 teaches:
wherein the operator controls the vehicle remotely without direct line of sight based on pieces of vehicle and surroundings information.
(Rastoll 2, FIG. 3; FIG. 10; ¶[0005]-¶[0006]: fusing sensor data to create visual representation of vehicle environment; ¶[0037]-¶[0039]: fused sensor data displayed to remote operator; ¶[0008]: remote operator controls vehicle in response to viewing the visual representation; Where the remote vehicle operator controls the vehicle remotely without direct line of sight based on a visual representation of the vehicle and surrounding information; this is similar to the claimed invention as described in FIGs. 2-3 and the last paragraph of page 17 to the first paragraph of page 19 of the specification).
It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Rastoll 1 and Van Beek with the features taught by Rastoll 2 because “Having more environmental data may enable remote-operated vehicles to be controlled with a higher degree of precision…” (Rastoll 2, ¶[0002]) and “…a camera video stream alone may be insufficient for the remote driver to perceive whether a vehicle can safely maneuver around a concrete pylon; however, supplementing the camera data with other sensor data (e.g., through fusion of image data from one or more cameras with data from a radar sensor and data from a steering angle sensor) not only enables determining the distance between the vehicle and the concrete pylon with a high degree of precision, but also enables determining whether the concrete pylon is in the vehicle's predicted path of travel” (Rastoll 2, ¶[0021]).
Regarding claim 10, the claim recites a non-transitory machine-readable memory medium having limitations similar to those of claim 1 and is therefore rejected on the same basis, as outlined above. Regarding the additional limitations recited in claim 10, Rastoll 1 further discloses:
A non-transitory machine-readable memory medium on which is stored a computer program for control of a vehicle by an operator, the computer program, when executed by a computer, causing the computer to perform the following steps:
(Rastoll 1, FIG. 1; ¶[0033]: vehicle control system, storage medium, electronic control unit; ¶[0023], ¶[0026], ¶[0044]: remote operator vehicle control).
Regarding claim 2, Rastoll 1, Van Beek, and Rastoll 2 teach the method as recited in claim 1. Rastoll 1 further discloses:
wherein the observation of the operator is carried out by detecting: stress level of the operator, and/or viewing direction of the operator, and/or behavior of the operator.
(Rastoll 1, FIG. 2; FIG. 3A; FIG. 4A; ¶[0051]: impediment annotated by remote operator, i.e. a behavior of the operator).
Regarding claim 3, Rastoll 1, Van Beek, and Rastoll 2 teach the method as recited in claim 1. Rastoll 1 and Van Beek further teach:
further comprising the following steps:
retrieving parameters for upcoming routes and/or areas from a second database for storing situation-related and/or location-related detection, fusion, and display parameters;
(Rastoll 1, FIG. 4A, FIG. 7, FIG. 8: local map navigation, update to cloud; ¶[0028], ¶[0032]: scene information including fusion maps of the environment; ¶[0053]: path information, GNSS, information from vehicle sensors;
Where AVs can retrieve an updated NAV map including sensor fusion maps and path information (retrieving parameters for upcoming routes and/or areas from a second database), and where the scene information includes vehicle sensor information and fusion maps annotated by the remote operator (for storing situation-related and/or location-related detection, fusion, and display parameters))
adapting the defined set of sensors, whose data are transmitted;
(Van Beek, FIG. 58A; FIG. 60; FIG. 61; FIG. 62B; ¶[0417]-¶[0422]: training machine learning model to adjust weight given to set of sensors based on vehicle context; ¶[0432]-¶[0437]: sensor weights may be zero, i.e. not used; Where different vehicle sensors are given different weights, i.e. some not used, based on vehicle context (adapting the defined set of sensors))
(Rastoll 1, ¶[0028]: scene information and path information updated; ¶[0044]: different sensors; Where the scene information is updated in the NAV maps (whose data are transmitted))
adapting the fusion of the data of the defined set of sensors; and
(Van Beek, FIG. 58A; FIG. 60; FIG. 61; FIG. 62B; ¶[0417]-¶[0422]: training machine learning model to adjust weight given to set of sensors based on vehicle context; ¶[0429]: sensor fusion-context dictionary; ¶[0432]-¶[0437]: sensor weights may be zero, i.e. not used;
Where different vehicle contexts result in different sets of sensors being weighted/used, and where the sensors' data are fused according to the sensor fusion policy (adapting the fusion of the data of the defined set of sensors))
adapting the display for the operator.
(Rastoll 1, ¶[0044]: operator uses additional sensors due to limited visibility; Where the use of additional sensors affects the remote operator’s visibility).
It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Rastoll 1 and Rastoll 2 with the features taught by Van Beek for at least the same reasons outlined with respect to claim 1.
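For purposes of illustration only, the following is a minimal sketch of the claim 3 sequence as characterized above: retrieving stored detection/fusion/display parameters for an upcoming route segment and adapting the active sensor set and operator display accordingly. It is not drawn from Rastoll 1 or Van Beek; all names and values are hypothetical.

```python
# Hypothetical sketch only: retrieving stored detection/fusion/display
# parameters for an upcoming route segment and adapting the active sensor set
# and operator display; names and values are illustrative, not from the references.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class SegmentParameters:
    sensor_weights: Dict[str, float]                         # which sensors to transmit and how to fuse them
    display_layers: List[str] = field(default_factory=list)  # what to present to the remote operator

# Stand-in for the "second database", keyed by route segment.
PARAMETER_DB: Dict[str, SegmentParameters] = {
    "segment_42": SegmentParameters(
        sensor_weights={"camera": 0.2, "lidar": 0.5, "radar": 0.3},
        display_layers=["lidar_overlay", "impediment_annotations"],
    ),
}

def adapt_for_segment(segment_id: str) -> SegmentParameters:
    """Retrieve stored parameters for an upcoming segment, falling back to defaults."""
    default = SegmentParameters(sensor_weights={"camera": 1.0}, display_layers=["video"])
    params = PARAMETER_DB.get(segment_id, default)
    # The defined set of transmitted sensors is those with a non-zero weight.
    active_sensors = [s for s, w in params.sensor_weights.items() if w > 0.0]
    print(f"{segment_id}: transmitting {active_sensors}, displaying {params.display_layers}")
    return params

adapt_for_segment("segment_42")
```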
Regarding claim 4, Rastoll 1, Van Beek, and Rastoll 2 teach the method as recited in claim 1. Van Beek further teaches:
wherein the fusion of the data of the defined set of sensors uses reduced data.
(Van Beek, FIG. 5; FIG. 70; FIG. 71; ¶[0127]: data filtering; ¶[0077]-¶[0078]; Where the sensor data is filtered prior to fusion).
It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Rastoll 1 and Rastoll 2 with the features taught by Van Beek because “…the filtering action may use a sensor noise model to properly account and suppress noise from the different types of sensors” (Van Beek, ¶[0485]).
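For purposes of illustration only, the following is a minimal sketch of reducing (filtering and downsampling) sensor data before fusion, in the sense characterized above for claim 4. It is not drawn from Van Beek; the threshold, stride, and field names are hypothetical.

```python
# Hypothetical sketch only: filtering and downsampling sensor data prior to
# fusion; threshold, stride, and field names are illustrative, not from Van Beek.
from typing import Dict, List

def reduce_points(points: List[Dict[str, float]],
                  min_intensity: float = 0.2,
                  stride: int = 4) -> List[Dict[str, float]]:
    """Drop low-intensity returns and downsample the remainder prior to fusion."""
    kept = [p for p in points if p.get("intensity", 0.0) >= min_intensity]
    return kept[::stride]

raw = [{"x": float(i), "y": 0.0, "intensity": (i % 10) / 10.0} for i in range(100)]
reduced = reduce_points(raw)
print(f"fusing {len(reduced)} of {len(raw)} points")
```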
Regarding claim 5, Rastoll 1, Van Beek, and Rastoll 2 teach the method as recited in claim 1. Rastoll 1 further discloses:
further comprising:
searching for recognized situations and/or locations in the first database;
(Rastoll 1, FIG. 4A, FIG. 7, FIG. 8: local map navigation, update to cloud; ¶[0028], ¶[0032]: scene information including fusion maps of the environment; ¶[0053]: path information, GNSS, information from vehicle sensors;
Where AVs can retrieve an updated NAV map including sensor fusion maps and path information (searching for recognized situations and/or locations in the first database))
evaluating the recognized situations and/or locations;
(Rastoll 1, FIG. 4A, FIG. 7, FIG. 8: local map navigation, update to cloud; ¶[0028], ¶[0032]: scene information including fusion maps of the environment; ¶[0053]: path information, GNSS, information from vehicle sensors; ¶[0094]: NAV map distributed to nearby vehicles;
Where the impediment and updated NAV map are distributed to nearby vehicles, i.e. vehicles in the recognized situation and/or location)
generating situation-adapted and/or location-adapted detection, fusion, and display parameters; and
(Rastoll 1, ¶[0028], ¶[0032]: scene information including fusion maps of the environment; ¶[0044]: operator uses additional sensors due to limited visibility; Where the remote operator uses additional sensors to enhance visibility, i.e. fusion and display parameters, and where the scene information includes sensor information and path information of the impediment, i.e. situation-adapted and/or location-adapted)
storing the situation-adapted and/or location-adapted detection, fusion, and display parameters in a second database.
(Rastoll 1, FIG. 4A, FIG. 7, FIG. 8: local map navigation, update to cloud; ¶[0028], ¶[0032]: scene information including fusion maps of the environment; ¶[0053]: path information, GNSS, information from vehicle sensors; ¶[0094]: NAV map distributed to nearby vehicles;
Where the impediment and updated NAV map are stored locally on the vehicles and uploaded to the cloud, i.e. stored in multiple databases).
Regarding claim 6, Rastoll 1 discloses:
A system for control of a vehicle, comprising:
(Rastoll 1, FIG. 3A; ¶[0021], ¶[0023], ¶[0026]: remote operator controls vehicle)
a teleoperation system configured to control the vehicle without direct line of sight based on pieces of vehicle and surroundings information;
(Rastoll 1, FIG. 3A; ¶[0021], ¶[0023], ¶[0026]: remote operator controls vehicle; ¶[0028]: scene information; ¶[0044]: remote operator selects sensors; Where the remote operator uses a remote system to control the vehicle based on sensor information and scene information)
sensors, which enable a surroundings model of the vehicle;
(Rastoll 1, FIG. 1; ¶[0051]: providing information from vehicle sensors; ¶[0044]: remote operator may use additional sensors; Where vehicle sensors provide information to the remote operator and where the operator can select additional sensors to improve scene visibility)
a predictive map to select a defined set of sensors and fuse the data of the defined set of sensors, which is configured to indicate whether and how data of individual sensors of the defined set of sensors are fused with one another;
(Rastoll 1, FIG. 2; FIG. 3A; FIG. 3B; FIG. 4A; FIG. 7; ¶[0027], ¶[0028], ¶[0045], ¶[0046]: NAV map, sensors detect impediment; ¶[0028], ¶[0071], ¶[0072]: sensor fusion maps; ¶[0053]: GNSS path information; ¶[0051]-¶[0052]: sensor information stored;
Where the NAV map (a predictive map) indicates locations of annotated impediments for which sensor fusion maps are stored, including the sensors and the fusion thereof for the impediment (to select a defined set of sensors and fuse the data of the defined set of sensors, which is configured to indicate whether and how data of individual sensors of the defined set of sensors are fused with one another))
a wireless network configured to transmit data of the sensors;
(Rastoll 1, FIG. 3A; FIG. 4A; FIG. 7; FIG. 8; ¶[0053]: path information and vehicle sensor information transmitted)
a control center configured to control the vehicle; and
(Rastoll 1, FIG. 3A; FIG. 4A; FIG. 7; FIG. 8; ¶[0053]: path information and vehicle sensor information transmitted to remote operator; ¶[0109]: remote control center 885).
Rastoll 1 is silent on the following limitations, bolded for emphasis. However, in the same field of endeavor, Van Beek teaches:
a training system configured to train the predictive map to select the defined set of sensors and use the defined set of sensors as a function of location, and/or situation, and/or preferences of the operator,
(Van Beek, FIG. 58A; FIG. 60; FIG. 61; FIG. 62B; ¶[0417]-¶[0422]: training machine learning model to adjust weight given to set of sensors based on vehicle context; ¶[0432]-¶[0437]: sensor weights may be zero, i.e. not used; ¶[0115]-¶[0116]: supervised learning- similar to using human annotated training data; Where a machine learning algorithm trains a model to adjust vehicle sensor weights based on context- see FIG. 61- and where the model is trained using supervised learning)
wherein the fusion of sensor data includes dynamically adjusting a weighting of individual sensors based on at least one of the location, situation, or predictive map data
(Van Beek, FIG. 58A; FIGs. 60-62B; ¶[0065]-¶[0068]; ¶[0149]; ¶[0417]-¶[0422]: training machine learning model to adjust weight given to set of sensors based on vehicle context; Where each sensor is weighted based on the context, i.e. the situation; for example the LIDAR sensor is weighted more heavily than the camera sensor at night).
It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Rastoll 1 with the features taught by Van Beek because “…non-uniform data sampling may significantly reduce the requirement of resources and the cost of the overall processing pipeline. Instead of sampling data from every sensor at a set interval (e.g., every second), the sampling of one or more sensors may be customized based on context” (Van Beek, ¶[0415]). See also ¶[0416] and ¶[0421].
Rastoll 1 and Van Beek are silent on the following limitations, bolded for emphasis. However, in the same field of endeavor, Rastoll 2 teaches:
wherein the operator controls the vehicle remotely without direct line of sight based on pieces of vehicle and surroundings information.
(Rastoll 2, FIG. 3; FIG. 10; ¶[0005]-¶[0006]: fusing sensor data to create visual representation of vehicle environment; ¶[0037]-¶[0039]: fused sensor data displayed to remote operator; ¶[0008]: remote operator controls vehicle in response to viewing the visual representation; Where the remote vehicle operator controls the vehicle remotely without direct line of sight based on a visual representation of the vehicle and surrounding information; this is similar to the claimed invention as described in FIGs. 2-3 and the last paragraph of page 17 to the first paragraph of page 19 of the specification).
It would have been obvious to a person having ordinary skill in the art prior to the effective filing date to combine the invention of Rastoll 1 and Van Beek with the features taught by Rastoll 2 because “Having more environmental data may enable remote-operated vehicles to be controlled with a higher degree of precision…” (Rastoll 2, ¶[0002]) and “…a camera video stream alone may be insufficient for the remote driver to perceive whether a vehicle can safely maneuver around a concrete pylon; however, supplementing the camera data with other sensor data (e.g., through fusion of image data from one or more cameras with data from a radar sensor and data from a steering angle sensor) not only enables determining the distance between the vehicle and the concrete pylon with a high degree of precision, but also enables determining whether the concrete pylon is in the vehicle's predicted path of travel” (Rastoll 2, ¶[0021]).
Regarding claim 7, Rastoll 1, Van Beek, and Rastoll 2 teach the system as recited in claim 6. Rastoll 1 further discloses:
further comprising: a backend, in which the data of the defined set of sensors are processed between the wireless network and the control center.
(Rastoll 1, FIG. 3A; FIG. 4A; FIG. 7; FIG. 8; FIG. 9; ¶[0053]: path information and vehicle sensor information transmitted to remote operator; ¶[0109]: remote control center 885; ¶[0033]: vehicle sensor data processed by vehicle control system 101; Where the vehicle sensor data is processed by vehicle control system 101, between the wireless network and the remote control center).
Regarding claim 8, Rastoll 1, Van Beek, and Rastoll 2 teach the system as recited in claim 7. Rastoll 1 further discloses:
wherein the backend is a data processing control center.
(Rastoll 1, FIG. 3A; FIG. 4A; FIG. 7; FIG. 8; FIG. 9; ¶[0053]: path information and vehicle sensor information transmitted to remote operator; ¶[0109]: remote control center 885; ¶[0033]: vehicle sensor data processed by vehicle control system 101; Where the vehicle sensor data is processed by vehicle control system 101, separate from the control center).
Regarding claim 9, Rastoll 1, Van Beek, and Rastoll 2 teach the system as recited in claim 6. Rastoll 1 further discloses:
wherein the fusion of the data of the individual sensors takes place in the vehicle, in a backend between the vehicle to be controlled and the operator, or on a computer of the operator.
(Rastoll 1, FIG. 3A; FIG. 4A; FIG. 7; FIG. 8; FIG. 9; ¶[0028]: vehicle scene information includes data fusion maps; ¶[0109]: remote control center 885; ¶[0033]: vehicle sensor data processed by vehicle control system 101; Where the vehicle sensor data is processed by vehicle control system 101, including data fusion maps, i.e. created in the vehicle).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Herman (US 20200254931 A1) discloses vehicle-rendering generation for vehicle display based on short-range communication. An example vehicle includes a communication module configured to wirelessly communicate with remote sensing devices, a display configured to present a vehicle rendering of the vehicle, and a controller. The controller is configured to, in response to detecting an activation event, collect sensing data via the communication module. The controller also is configured to determine whether an object is coupled to the vehicle based on the sensing data and, in response to detecting an object coupled to the vehicle, modify the vehicle rendering to include an object rendering of the object.
Fairley et al. (US 20220001892 A1) discloses a system for dynamic policy curation that includes a computing system and interfaces with an autonomous agent. A method for dynamic policy curation includes collecting a set of inputs; processing the set of inputs; and determining a set of available policies based on processing the set of inputs. Additionally or alternatively, the method can include any or all of: selecting a policy; implementing a policy; and/or any other suitable processes.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Tawri M McAndrews whose telephone number is (571)272-3715. The examiner can normally be reached M-W (0800-1000).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, James Lee can be reached at (571)270-5965. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/T.M.M./ Examiner, Art Unit 3668