DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
Claims 1-20 are pending.
Claims 1-20 are rejected under 35 U.S.C. 102.
Priority
Applicant’s claim of domestic benefit based on provisional application No. 63/615,709, filed 12/28/2023, is acknowledged. The subject matter of the provisional application is substantially the same as that of the instant application; accordingly, the filing date of the provisional application is used as the effective filing date of the instant application.
Information Disclosure Statement
The information disclosure statement(s) (IDS(s)) submitted on 04/01/2025 and 07/21/2025 is/are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement(s) is/are being considered by the examiner.
Duplicate Claim Warning
Applicant is advised that should claim 15 be found allowable, claim 16 will be objected to under 37 CFR 1.75 as being a substantial duplicate thereof. When two claims in an application are duplicates or else are so close in content that they both cover the same thing, despite a slight difference in wording, it is proper after allowing one claim to object to the other as being a substantial duplicate of the allowed claim. See MPEP § 608.01(m).
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claim(s) 1-20 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Herman et al. (US 2022/0121950 A1, “Herman”, included in IDS filed 04/01/2025).
Regarding claim 1: Herman teaches: A method of operating an autonomous vehicle, comprising: ([0035] operate vehicle in autonomous mode. [0038] control vehicle systems)
accessing sensor data captured by at least one sensor corresponding to the autonomous vehicle associated with operation of the autonomous vehicle in an environment, the environment characterized by one or more environmental conditions; ([0065] exposure of radar sensor components, antenna, to weather conditions such as rain, snow, wind, can cause radar signal absorption. Exposure of radar sensor result in change of sensor detection characteristics. [0066] when radar sensor radome is missing, atmospheric moisture impact radar sensor operation, causing change of object data resolution. computer is programmed to increase operating frequency or operating power level of radar sensor upon determining, based on determined weather condition, presence of moisture on radar sensor. [0067] absence of radome result in higher probability of water or ice formation onto vehicle body. Such ice formation typically causes blockage of radar signal and thereby lack of detections by radar sensor. [0068] algorithm to “blocked,” based on weather condition, missing radome. blockage of radar causes deactivation of autonomous operation or ignoring radar sensor data)
generating, based on the sensor data and with a machine-learned model, an output that indicates a sensor support level in the environment, ([0018] determine radar detection range based on number of objects detected by radar sensor, second sensor, in plurality of ranges from vehicle. [0019] train ML program by applying set of training data including inputs and expected output, “the radar radome is missing” or “the radar radome is present”. [0020] weather data include data concerning snow, rain, wind. [0021] aerodynamic drag of vehicle based on vehicle speed, fuel consumption, engine torque, road slope. [0022] method, inputting to trained ML program, sensor fusion error that measures statistical correlation of data received from radar sensor and second sensor in vehicle, radar detection range, amount of reflection from radar radome, weather data, aerodynamic data that measures aerodynamic drag opposing vehicle motion, and outputting from trained ML program determination concerning presence of radar radome. [0023] applying trained NN to second sensor data, wherein ML program output determination concerning presence of radar radome based on received second sensor data)
wherein the machine-learned model is trained using training data, the training data comprising a plurality of instances of logged sensor data depicting examples of a reference object, ([0028] training ML program by applying set of training data including inputs and expected output. [0047] CNN trained to determine whether sensor radome present by processing sensor fusion error, radar detection range, reflection from radar radome, weather, aerodynamic drag. [0048] CNN trained by inputting ground truth data, and backpropagating results to be compared with ground truth to determine loss function. determining parameters that minimize loss function. [0052] vehicle include sensors, camera sensor, data fusion. [0063] NN trained to determine whether radome is missing, that front bumper of vehicle is missing, based on determined aerodynamic drag, vehicle speed, engine torque. [0096] training NN for determining whether radar sensor radome is present. [0097] computer receives training data. training data specify values of inputs and expected outputs (“the radar radome is missing” or “the radar radome is present”). [0098] computer trains NN. performs iterative routine until difference between actual output of NN and expected output is less than specified threshold)
each instance of the plurality of instances of logged sensor data being associated with a label indicating a range at which the reference object was detected in the instances of logged sensor data, and ([0047] CNN trained to determine whether sensor radome present by processing sensor fusion error, radar detection range, reflection from radar radome, weather, aerodynamic drag. [0048] CNN trained by inputting ground truth data, and backpropagating results to be compared with ground truth to determine loss function. determining parameters for convolutional and fully-connected layers that minimize loss function. When trained, CNN determine presence of radar sensor radome based on input data. [0049] output state of NN is back-propagated to compare with ground truth to determine loss function. parameters with most correct results saved as parameters used to program CNN during operation. [0050] NN is considered “trained.” [0051] FIG. 5, vehicle radar sensor and respective radome of sensor and locations of multiple objects relative to coordinate system. when radome of radar sensor is missing, then identify object in incorrect location, identified as object, or other types of errors. estimated relative radial velocity is incorrect, return signal intensity of reflection from an object is incorrect, noise floor incorrect resulting in incorrect object detection, localization. [0052] vehicle include sensors, camera sensor, data fusion. [0053] specify location of radar sensor coordinate system relative to coordinate system or second sensor coordinate system, and pose (roll, pitch, and yaw) of radar sensor relative to coordinate system or second sensor coordinate system. [0054] radar sensor and second sensor detect features at similar locations (relative distance between reported object data of sensors is less than threshold) relative to coordinate system. [0055] Extrinsic calibrations can become obsolete if intrinsic calibration values of any of sensors used for sensor fusion change)
wherein the reference object is not depicted in the sensor data captured by the at least one sensor; and ([0045] FIG. 4, incorrectly detecting an object 310 instead of actual object 300. computer may provide incorrect or undesired output upon falsely detecting the object 310, e.g., may actuate the vehicle 100 to brake because of the incorrectly detected object 310. “Incorrectly detected,” in the present context includes determining a wrong location (object 310) for an existing object 300, failing to detect an object, and/or detecting a “ghost” object, i.e., an object that is not physically present. [0046] To avoid a false detection, computer programmed to determine whether radome of a sensor is present, and to operate the vehicle based on such determination. filtering mechanism such as a Kalman, Bayesian, used to filter inputs to the machine learning program)
controlling the autonomous vehicle based at least in part on sensor support level ([0065] exposure of radar sensor components, antenna, to weather conditions such as rain, snow, wind, can cause radar signal absorption. Exposure of radar sensor result in change of sensor detection characteristics. [0066] when radar sensor radome is missing, atmospheric moisture impact radar sensor operation, causing change of object data resolution. computer is programmed to increase operating frequency or operating power level of radar sensor upon determining, based on determined weather condition, presence of moisture on radar sensor. [0067] absence of radome result in higher probability of water or ice formation onto vehicle body. Such ice formation typically causes blockage of radar signal and thereby lack of detections by radar sensor. [0068] algorithm to “blocked,” based on weather condition, missing radome. blockage of radar causes deactivation of autonomous operation or ignoring radar sensor data).
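For illustration only, and not as part of the rejection, the following minimal Python sketch shows the general shape of the method as mapped above: sensor data is accessed, a trained model produces a sensor support level, and the vehicle is controlled based on that level. All names, the threshold, and the stand-in model are hypothetical and are not drawn from Herman or the claims.
```python
# Illustrative sketch only; "model" stands in for a trained network
# such as Herman's CNN ([0047]). All names are hypothetical.
from dataclasses import dataclass
from typing import Callable, Sequence

@dataclass
class SensorFrame:
    """Sensor data captured during operation in an environment."""
    radar_returns: Sequence[float]  # e.g., return intensities
    weather: str                    # e.g., "rain", "snow", "clear"

def operate_vehicle(frame: SensorFrame,
                    model: Callable[[SensorFrame], float],
                    min_support: float = 0.5) -> str:
    """Access sensor data, infer a sensor support level with a trained
    model, and choose a control action based on that level."""
    support_level = model(frame)  # model trained offline on logged data
    if support_level < min_support:
        # cf. Herman [0068]: blockage leads to deactivating autonomous
        # operation or ignoring the affected sensor's data
        return "deactivate_autonomous_mode"
    return "continue_autonomous_mode"

# Usage with a dummy model that scores support from return strength:
action = operate_vehicle(
    SensorFrame(radar_returns=[0.9, 0.8], weather="clear"),
    model=lambda f: sum(f.radar_returns) / len(f.radar_returns))
```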
Regarding claim 2: Herman teaches: The method of claim 1, further comprising: selecting a distance from a plurality of distances having a corresponding sensor support quantity that meets a detectability threshold, wherein the output of the machine-learned model comprises the plurality of sensor support quantities over a plurality of distances, a first sensor support quantity corresponding to the first distance of the plurality of distances and a second sensor support quantity corresponding to a second distance of the plurality of distances ([0054] distance between location coordinates of an object reference point received from the radar compared to reference point received from a second sensor (camera, lidar) may be less than a threshold. [0056] Sensor fusion error measures a statistical correlation of data received from a radar and second sensor in vehicle. error increases as average distance or difference in azimuth angles of objects detected by second sensor and location coordinates of objects detected by radar sensor increases. [0060] a missing radome results in a changed radar sensor range. computer to determine the radar detection range based on number of objects detected by radar and second sensor, in plurality of respective ranges from vehicle. [0061] computer to determine that radar detection range is reduced from 100 m to 80 m, upon determining that second sensor detects an object within 82 m from vehicle whereas the radar is unable to detect object, and determining that radar and second sensor detect an object at distance of 80 m from vehicle).
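For illustration only, the selection step of claim 2 can be sketched as follows: given per-distance support quantities from the model, pick the farthest distance whose quantity still meets the detectability threshold. The function and numbers are invented; the example values echo Herman [0061].
```python
# Hypothetical sketch of the claim 2 selection step.
def select_supported_distance(distances, support_quantities, threshold):
    """distances and support_quantities are parallel sequences."""
    supported = [d for d, q in zip(distances, support_quantities)
                 if q >= threshold]
    return max(supported) if supported else None

# cf. Herman [0061]: the second sensor detects at 82 m where radar does
# not, and both detect at 80 m, indicating a range reduced to ~80 m.
print(select_supported_distance([60, 80, 100], [0.9, 0.7, 0.2], 0.5))  # 80
```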
Regarding claim 3: Herman teaches: The method of claim 2, the sensor support quantity indicating at least one of a number of lidar points per unit surface area of the reference object that would be returned, a number of radar points per unit surface area of the reference object that would be returned, a result of applying an object detection mask to a portion of the sensor data depicting the reference object, or a result of a second machine-learned model that is trained to identify the reference object in at least a portion of the sensor data ([0043] radar determine distance to object based on “time-to-travel” to object. amount of time in which a transmitted beam travels from an antenna to the object, object reflects beam, and reflection of transmitted beam is received by antenna. change in return frequency gives relative radial velocity between radar and object. Radar sensor output 2D or 3D. detect vehicle body parts based on time-to-travel. default base reflection pattern include data specifying characteristics, time-to-travel, signal amplitude, frequency, of reflections expected. [0047] CNN trained to determine whether sensor radome present by processing sensor fusion error, radar detection range, reflection from radar radome, weather, aerodynamic drag. [0054] radar sensor and second sensor detect features at similar locations (relative distance between reported object data of sensors is less than threshold) relative to coordinate system).
Regarding claim 4: Herman teaches: The method of claim 2, the detectability threshold indicating a threshold number of returned points per unit surface area of the reference object ([0043] default base reflection pattern of a radar sensor may include data specifying characteristics, time-to-travel, signal amplitude, frequency, of reflections expected to be received from body. missing radome typically results in change of base reflection pattern. FIG. 3, changes of radar sensor gain versus frequency when the radome is present or missing. missing radome may result in changes of a return coefficient as a function of angle, and/or return intensity versus distance, frequency, and/or angle. [0051] each incorrectly identified object is detection of respective object nearest to location of the incorrectly identified object. estimated relative radial velocity may be incorrect, return signal intensity of reflection from an object may be higher than nominal, noise floor may increase or decrease relative to a nominal vehicle configuration resulting in incorrect object detection, localization).
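For illustration only, the per-unit-area quantity of claim 3 and the threshold of claim 4 admit a simple sketch; the functions, numbers, and threshold below are hypothetical and are not Herman's.
```python
# Hypothetical sketch of the claim 3/4 quantity and threshold check:
# density of sensor returns on the reference object versus a cutoff.
def points_per_unit_area(num_points: int, surface_area_m2: float) -> float:
    """Returned points per square meter of the reference object."""
    if surface_area_m2 <= 0:
        raise ValueError("surface area must be positive")
    return num_points / surface_area_m2

def meets_detectability(num_points: int, surface_area_m2: float,
                        threshold_pts_per_m2: float) -> bool:
    return points_per_unit_area(num_points,
                                surface_area_m2) >= threshold_pts_per_m2

# Weather absorbs signal (cf. Herman [0065]), so the same object at the
# same range returns fewer points and may fall below the threshold.
assert meets_detectability(42, 1.5, 20.0)       # 28 pts/m^2: detectable
assert not meets_detectability(12, 1.5, 20.0)   # 8 pts/m^2: not detectable
```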
Regarding claim 5: Herman teaches: The method of claim 2, the sensor data comprising image data captured by a camera, the detectability threshold indicating a threshold result of applying an object detection mask to a portion of the image data ([0054] radar sensor and second sensor detect features at similar locations. [0043] radar sensor determine distance to object based on “time-to-travel” to object. FIG. 3 shows exemplary graph which illustrates changes of radar sensor gain versus frequency when radome is present, whereas exemplary graph illustrates changes of radar sensor gain versus frequency when radome is missing. missing radome result in changes return coefficient as function of angle, or return intensity versus distance, frequency, or angle. [0048] CNN trained by inputting ground truth data, and backpropagating results to be compared with ground truth to determine loss function. determining parameters for convolutional and fully-connected layers that minimize loss function. When trained, CNN determine presence of radar sensor radome based on input data. [0050] NN is considered “trained”).
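For illustration only, the mask-based detectability check of claim 5 can be sketched as below; the scoring rule and brightness cutoff are invented for illustration and are not drawn from Herman.
```python
# Hypothetical sketch of the claim 5 check: apply a binary object-
# detection mask to an image patch and score the masked evidence.
def mask_score(patch, mask, brightness_cutoff=0.5):
    """patch and mask are parallel sequences of pixel values (floats)
    and booleans; returns the fraction of masked pixels with evidence."""
    masked = [(p, m) for p, m in zip(patch, mask) if m]
    if not masked:
        return 0.0
    hits = sum(1 for p, _ in masked if p > brightness_cutoff)
    return hits / len(masked)

# A detectability threshold on this score (e.g., >= 0.6) would decide
# whether the reference object is supported at a given distance.
score = mask_score([0.9, 0.7, 0.2, 0.1], [True, True, True, False])  # ~0.67
```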
Regarding claim 6: Herman teaches: The method of claim 2, the sensor data comprising image data captured by a camera, the detectability threshold indicating an output of a second machine-learned model trained to identify the reference object in the image data ([0054] radar sensor and second sensor detect features at similar locations (relative distance between reported object data of sensors is less than threshold) relative to coordinate system. [0022] method, inputting to trained ML program, sensor fusion error that measures statistical correlation of data received from radar sensor and second sensor in vehicle, radar detection range, amount of reflection from radar radome, weather data, aerodynamic data that measures aerodynamic drag opposing vehicle motion, and outputting from trained ML program determination concerning presence of radar radome. [0023] applying trained NN to second sensor data, wherein ML program output determination concerning presence of radar radome based on received second sensor data. [0052] vehicle include sensors, camera sensor, data fusion).
Regarding claim 7: Herman teaches: The method of claim 1, wherein the sensor support level comprises an indication of a visibility classification in the one or more environmental conditions ([0068] algorithm to “blocked,” based on weather condition, missing radome. blockage of radar causes deactivation of autonomous operation or ignoring radar sensor data. [0065] exposure of radar sensor components, antenna, to weather conditions such as rain, snow, wind, can cause radar signal absorption. Exposure of radar sensor result in change of sensor detection characteristics. [0066] when radar sensor radome is missing, atmospheric moisture impact radar sensor operation, causing change of object data resolution. computer is programmed to increase operating frequency or operating power level of radar sensor upon determining, based on determined weather condition, presence of moisture on radar sensor. [0022] method, inputting to trained ML program, sensor fusion error that measures statistical correlation of data received from radar sensor and second sensor in vehicle, radar detection range, amount of reflection from radar radome, weather data, aerodynamic data that measures aerodynamic drag opposing vehicle motion, and outputting from trained ML program determination concerning presence of radar radome).
Regarding claim 8: Herman teaches: The method of claim 7, wherein the visibility classification is one of nominal, degraded, or severely degraded ([0065] Exposure of the radar sensor to weather conditions may result in a change of sensor detection characteristics, e.g., detecting an object 310 instead of the object 300, which may be due to a higher likelihood of ice, snow buildup on the radar sensor 130 compared to the radome (i.e. degraded). [0044] default base reflections (i.e. nominal). [0051] return signal intensity of reflection from an object may be higher than nominal, a noise floor may increase or decrease relative to a nominal vehicle configuration resulting in incorrect object detection, localization. [0068] adjust an operating parameter of a radar sensor blockage detection algorithm, e.g., setting an output of the algorithm to “blocked,” based on the weather condition and the determined missing radome. blockage of radar sensor may result in a deactivation of an autonomous operation and/or ignoring the radar sensor data for the vehicle operation (i.e. severely degraded)).
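For illustration only, the three-way classification of claim 8 can be sketched with hypothetical numeric bands; the band boundaries are invented, while the class names track the claim and the examiner's annotations above.
```python
# Hypothetical sketch of the claim 8 visibility classification.
def classify_visibility(support_level: float) -> str:
    if support_level >= 0.8:
        return "nominal"            # cf. Herman [0044], default reflections
    if support_level >= 0.4:
        return "degraded"           # cf. [0065], weather-altered detection
    return "severely degraded"      # cf. [0068], blocked; autonomy disabled
```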
Regarding claim 9: Herman teaches: The method of claim 1, further comprising: executing the machine-learned model using at least a portion of the training data as input to generate a training output of the machine-learned model; comparing the training output of the machine-learned model to the label data; and modifying the machine-learned model based at least in part on the comparing of the training output of the machine-learned model to the label data ([0019]-[0023] method, inputting to trained ML program, sensor fusion error that measures statistical correlation of data received from radar sensor and second sensor in vehicle, radar detection range, amount of reflection from radar radome, weather data, aerodynamic data that measures aerodynamic drag opposing vehicle motion, and outputting from trained ML program determination concerning presence of radar radome. applying trained NN to second sensor data, wherein ML program output determination concerning presence of radar radome based on received second sensor data. [0028] training ML program by applying set of training data including inputs and expected output. [0047]-[0050] CNN trained to determine whether sensor radome present by processing sensor fusion error, radar detection range, reflection from radar radome, weather, aerodynamic drag. CNN trained by inputting ground truth data, and backpropagating results to be compared with ground truth to determine loss function. determining parameters for convolutional and fully-connected layers that minimize loss function. When trained, CNN determine presence of radar sensor radome based on input data. output state of NN is back-propagated to compare with ground truth to determine loss function. parameters with most correct results saved as parameters used to program CNN during operation. NN is considered “trained.” [0098] computer trains NN. performs iterative routine until difference between actual output of NN and expected output is less than specified threshold).
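For illustration only, the training loop of claim 9 (execute the model on training data, compare the output to the labels, modify the model) can be sketched with minimal gradient descent; a one-layer linear model stands in for Herman's CNN ([0047]-[0050]), and the learning rate, tolerance, and iteration cap are invented.
```python
# Hypothetical sketch of the claim 9 training loop; mirrors Herman
# [0098]'s iterate-until-difference-below-threshold routine.
def train(weights, examples, labels, lr=0.01, tol=1e-3, max_iters=1000):
    """A one-layer linear model fit by gradient descent on squared error.
    examples: list of feature lists; labels: list of target floats."""
    for _ in range(max_iters):
        mean_err = 0.0
        for x, y in zip(examples, labels):
            pred = sum(w * xi for w, xi in zip(weights, x))   # forward pass
            err = pred - y                                    # compare to label
            mean_err += abs(err) / len(examples)
            # modify the model: gradient step on 0.5 * err**2
            weights = [w - lr * err * xi for w, xi in zip(weights, x)]
        if mean_err < tol:  # stop when output/label gap is small enough
            break
    return weights
```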
Regarding claim 10: Herman teaches: The method of claim 8, each instance of the logged sensor data also being associated with label data indicating a visibility classification for the respective instance of the logged sensor data ([0068] algorithm to “blocked,” based on weather condition, missing radome. blockage of radar causes deactivation of autonomous operation or ignoring radar sensor data. [0065] exposure of radar sensor components, antenna, to weather conditions such as rain, snow, wind, can cause radar signal absorption. Exposure of radar sensor result in change of sensor detection characteristics. [0066] when radar sensor radome is missing, atmospheric moisture impact radar sensor operation, causing change of object data resolution. computer is programmed to increase operating frequency or operating power level of radar sensor upon determining, based on determined weather condition, presence of moisture on radar sensor. [0022] method, inputting to trained ML program, sensor fusion error that measures statistical correlation of data received from radar sensor and second sensor in vehicle, radar detection range, amount of reflection from radar radome, weather data, aerodynamic data that measures aerodynamic drag opposing vehicle motion, and outputting from trained ML program determination concerning presence of radar radome).
Regarding claim 11: Herman teaches: An autonomous vehicle comprising: at least one processor programmed to perform operations comprising ([0035], [0038] control vehicle systems. [0010] computer including a processor and memory. stores instructions executable by processor so computer is programmed to input to trained machine learning program)
accessing sensor data captured by at least one sensor corresponding to the autonomous vehicle associated with operation of the autonomous vehicle in an environment, the environment characterized by one or more environmental conditions; ([0065]-[0068])
generating, based on the sensor data and a machine-learned model, an output that indicates a sensor support level in the environment, ([0018]-[0023])
wherein the machine-learned model is trained using training data, the training data comprising a plurality of instances of logged sensor data depicting examples of a reference object, ([0028], [0047]-[0048], [0052], [0063], [0096]-[0098])
each instance of the plurality of instances of logged sensor data being associated with a label indicating a range at which the reference object was detected in the instances of logged sensor data, and ([0047]-[0055])
wherein the reference object is not depicted in the sensor data captured by the at least one sensor; and ([0045]-[0046])
controlling the autonomous vehicle based at least in part on sensor support level ([0065]-[0068]).
Regarding claim 12: Herman teaches: The autonomous vehicle of claim 11, the operations further comprising selecting a distance from a plurality of distances having a corresponding sensor support quantity that meets a detectability threshold, wherein the output of the machine-learned model comprises the plurality of sensor support quantities over a plurality of distances, a first sensor support quantity corresponding to a first distance of the plurality of distances and a second sensor support quantity corresponding to the second distance of the plurality of distances ([0054], [0056], [0060], [0061]).
Regarding claim 13: Herman teaches: The autonomous vehicle of claim 12, the sensor support quantity indicating at least one of a number of lidar points per unit surface area of the reference object that would be returned, a number of radar points per unit surface area of the reference object that would be returned, a result of applying an object detection mask to a portion of the sensor data depicting the reference object, or a result of a second machine-learned model that is trained to identify the reference object in at least a portion of the sensor data ([0043], [0047], [0054]).
Regarding claim 14: Herman teaches: The autonomous vehicle of claim 12, the detectability threshold indicating a threshold number of returned points per unit surface area of the reference object ([0043], [0051]).
Regarding claim 15: Herman teaches: The autonomous vehicle of claim 12, the sensor data comprising image data captured by a camera, the detectability threshold indicating an output of a second machine-learned model trained to identify the reference object in the image data ([0054], [0043], [0048], [0050]).
Regarding claim 16: Herman teaches: The autonomous vehicle of claim 12, the sensor data comprising image data captured by a camera, the detectability threshold indicating an output of a second machine-learned model trained to identify the reference object in the image data ([0054], [0043], [0048], [0050]).
Regarding claim 17: Herman teaches: The autonomous vehicle of claim 11, wherein the sensor support level comprises an indication of a visibility condition in the environment ([0068], [0065]-[0066], [0022]).
Regarding claim 18: Herman teaches: The autonomous vehicle of claim 11, the operations further comprising: executing the machine-learned model using at least a portion of the training data as input to generate a training output of the machine-learned model; comparing the training output of the machine-learned model to the label data; and modifying the machine-learned model based at least in part on the comparing of the training output of the machine-learned model to the label data ([0019]-[0023], [0028], [0047]-[0050], [0098]).
Regarding claim 19: Herman teaches: The autonomous vehicle of claim 17, each instance of the logged sensor data also being associated with label data indicating a visibility classification for the respective instance of the logged sensor data ([0068], [0065]-[0066], [0022]).
Regarding claim 20: Herman teaches: At least one non-transitory computer-readable storage media comprising instructions thereon that, when executed by at least one processor, cause the at least one processor to perform operations comprising: ([0035], [0038], [0010] computer including a processor and memory. stores instructions executable by processor so computer is programmed to input to trained machine learning program. [0101] storage, ROM, RAM)
accessing sensor data captured by at least one sensor corresponding to an autonomous vehicle associated with operation of the autonomous vehicle in an environment, the environment characterized by one or more environmental conditions; ([0065]-[0068])
generating, based on the sensor data and a machine-learned model, an output that indicates a distance at which a reference object would meet a detectability threshold in the environment; and ([0018]-[0023])
controlling the autonomous vehicle based at least in part on the distance or a visibility classification derived from the distance ([0065]-[0068]).
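For illustration only, the control step of claim 20 can be sketched as a mapping from the model's output distance (the distance at which the reference object would still meet the detectability threshold) to a visibility class that gates autonomous operation; the distance bands are invented (cf. Herman [0061], where the detection range drops from 100 m to 80 m).
```python
# Hypothetical sketch of the claim 20 control step.
def control_from_detectable_distance(distance_m: float) -> str:
    if distance_m >= 100.0:
        visibility = "nominal"
    elif distance_m >= 50.0:
        visibility = "degraded"
    else:
        visibility = "severely degraded"
    return ("deactivate_autonomous_mode"
            if visibility == "severely degraded"
            else "continue_autonomous_mode")
```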
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MADISON B EMMETT whose telephone number is (303)297-4231. The examiner can normally be reached Monday - Friday 9:00 - 5:00 ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Tommy Worden can be reached at (571)272-4876. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MADISON B EMMETT/Examiner, Art Unit 3658