DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
This is a Non-Final rejection on the merits of this application. Claims 1-6, 8, and 14-18 are currently pending, as discussed below.
The Examiner notes that the rejections are based on the broadest reasonable interpretation of the claim language. Applicant is kindly invited to consider each reference as a whole. References are to be interpreted as they would be by one of ordinary skill in the art rather than by a novice. See MPEP 2141. Therefore, the relevant inquiry when interpreting a reference is not what the reference expressly discloses on its face, but what the reference would teach or suggest to one of ordinary skill in the art.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 21 January 2025 has been entered.
Response to Amendment and/or Argument
Applicant’s amendments and/or arguments with respect to the claim interpretations under 35 U.S.C. 112(f) as set forth in the Office action of 21 October 2024 have been considered and are persuasive. Therefore, the claim interpretations under 35 U.S.C. 112(f) as set forth in the Office action of 21 October 2024 have been withdrawn.
Applicant’s amendments and/or arguments with respect to the rejections of claims 1, 9, 11, 12, and 14 under 35 U.S.C. 112(a) and 35 U.S.C. 112(b) as set forth in the Office action of 21 October 2024 have been considered and are persuasive. Therefore, the rejections of claims 1, 9, 11, 12, and 14 under 35 U.S.C. 112(a) and 35 U.S.C. 112(b) as set forth in the Office action of 21 October 2024 have been withdrawn.
Applicant’s amendments and/or arguments with respect to the rejections of claims 1-6 and 8-14 under 35 U.S.C. 101 as set forth in the Office action of 21 October 2024 have been considered and are persuasive. Therefore, the rejections of claims 1-6 and 8-14 under 35 U.S.C. 101 as set forth in the Office action of 21 October 2024 have been withdrawn.
Applicant’s arguments with respect to claims 1-6, 8, and 14-18 under 35 U.S.C. 103 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Claim Objections
Claims 1 and 14 are objected to because of the following informalities:
Claim 1, Lines 4-5: “of the vehicle” should read --of the at least semi-automated vehicle--
Claim 14, Lines 6-7: “of the vehicle” should read --of the at least semi-automated vehicle--
Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of the first paragraph of 35 U.S.C. 112(a):
(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.
The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:
The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.
Claims 1 and 14 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claims contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.
Regarding claim 1 (similarly claim 14), Applicant has apparently not described, in the specification, in sufficient detail the different types of data mentioned in the claims, because the specification does not define or explain (by algorithm(s) or examples) what any and/or all of “data of sensor system for environmental conditions of surroundings of the vehicle”, “data about environmental conditions”, “data about environmental conditions of the surroundings for the detection conditions”, “data of the sensor system for determining surroundings” and “data for surroundings for the detection conditions” mean, how they differ from each other, or how any/all of the data is obtained or used. The published specification [0090] merely restates: "Data of sensor systems for environmental conditions 212 of the surroundings, data about environmental conditions 214 of the surroundings, data of sensor systems for determining surroundings 312 and data for surroundings 314, for example topographical information and/or map information and/or information made available by other mobile platforms (V2X), represent input data for the evaluation device for detection conditions 210." Accordingly, the Examiner believes that Applicant has not demonstrated to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.
The dependent claims are also rejected under 35 U.S.C. 112, first paragraph, because they depend from the rejected independent claims.
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1 and 14 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Claim 1 (similarly claim 14) recites the limitation “providing data of sensor systems for environmental conditions of surroundings of the vehicle and data about environmental conditions of the surroundings for the detection conditions; providing data of the sensor systems for determining surroundings and data for surroundings for the detection conditions”, which is indefinite and unclear to the Examiner because these different (or the same?) data are not clearly distinguished. For example, it is unclear what the difference is between the "data of sensor systems for environmental conditions" and the "data about environmental conditions"; what the difference is between the "data of the sensor systems for determining surroundings", the "data for surroundings for the detection conditions", and "determining detection conditions"; and whether these are the same data or different data (e.g. raw sensor data vs. post-processed data/information), because the published specification [0090] merely restates: "Data of sensor systems for environmental conditions 212 of the surroundings, data about environmental conditions 214 of the surroundings, data of sensor systems for determining surroundings 312 and data for surroundings 314, for example topographical information and/or map information and/or information made available by other mobile platforms (V2X), represent input data for the evaluation device for detection conditions 210." Hence, this limitation renders the claim indefinite.
Claim 1 (similarly claim 14) recites the limitation “the overall integrity value being based on a first integrity value and a second integrity value, wherein each of the first integrity value and the second integrity value is determined using an automotive safety integrity level (ASIL) provided by the sensor systems”, which is indefinite and unclear to the Examiner because it is unclear (from the teachings of the specification) how the first integrity value and the second integrity value are provided by the sensor systems, e.g. whether the sensors output an ASIL value or whether the system uses sensors with known ASIL levels/ratings to compute an individual/overall integrity value. Hence, this claim limitation renders the claim indefinite.
The dependent claims are also rejected under 35 U.S.C. 112, second paragraph, because they depend from the rejected independent claims.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-6, 8, and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Amirloo Abolfathi (US 2019/0064799 A1, hereinafter Abolfathi) in view of Mihai (DE1020192110006 A1, English translation).
Regarding claim 1 (similarly claim 14), Abolfathi teaches A method for evaluating detection conditions and controlling an at least semi-automated vehicle (see at least Abstract), the method comprising:
providing data of sensor systems for environmental conditions of surroundings of the vehicle and data about environmental conditions of the surroundings for the detection conditions; (see at least Fig. 1-5 [0035-0077]: the reliability of the vehicle system is assessed based on condition data received, wherein the condition data may include vehicle system condition data regarding the status of a vehicle system and/or environmental condition data regarding the status of the environment around the vehicle. The condition data may be received from some combination of the vehicle’s sensor systems, the vehicle system diagnostic sensors, and/or external sources of information. When the reliability assessment system assesses the reliability of an assessed vehicle system, it may combine data from multiple sources, e.g. environmental condition data may be used to assess the reliability of one or more sensor systems or non-sensor systems (e.g. environmental condition data indicating night-time low-light conditions or rain could both negatively affect the assessment of the reliability of the camera system).)
providing data of the sensor systems for determining surroundings and data for surroundings for the detection conditions; (see at least Fig. 1-5 [0035-0077]: the reliability of the vehicle system is assessed based on condition data received, wherein the condition data may include vehicle system condition data regarding the status of a vehicle system and/or environmental condition data regarding the status of the environment around the vehicle. The condition data may be received from some combination of the vehicle’s sensor systems, the vehicle system diagnostic sensors, and/or external sources of information. When the reliability assessment system assesses the reliability of an assessed vehicle system, it may combine data from multiple sources, e.g. environmental condition data may be used to assess the reliability of one or more sensor systems or non-sensor systems (e.g. environmental condition data indicating night-time low-light conditions or rain could both negatively affect the assessment of the reliability of the camera system).)
determining the detection conditions; (see at least Fig. 1-5 [0035-0077]: When the reliability assessment system assesses the reliability of an assessed vehicle system, it may combine data from multiple sources, e.g. environmental condition data may be used to assess the reliability of one or more sensor systems or non-sensor systems (e.g. environmental condition data indicating night-time low-light conditions or rain could both negatively affect the assessment of the reliability of the camera system; a combination of rain and faulty anti-lock braking functionality might trigger a negative assessment of the reliability of the braking system).)
evaluating the determined detection conditions based on an overall integrity value, the overall integrity value being based on a first integrity value and a second integrity value, (see at least Fig. 1-5 [0035-0077]: where the assessed vehicle system is a sensor system, the selected autonomous driving mode may indicate a weight to be given to the output of that sensor system in making driving decisions (e.g., where the reliability assessment receives condition data indicating low reliability of the camera system, the autonomous driving module may give low weight to camera system in making driving decisions and higher weight to output of other sensor system). In some examples, the reliability assessment system may incorporate features of the Functional Safety Standard (ISO 26262) for detecting system reliability.)
determining weighting factors based on the evaluation of the determined detection conditions; (see at least Fig. 1-5 [0035-0077]: where the assessed vehicle system is a sensor system, the selected autonomous driving mode may indicate a weight to be given to the output of that sensor system in making driving decisions (e.g., where the reliability assessment receives condition data indicating low reliability of the camera system, the autonomous driving module may give low weight to camera system in making driving decisions and higher weight to output of other sensor system). In some examples, the reliability assessment system 102 may incorporate features of the Functional Safety Standard (ISO 26262) for detecting system reliability.)
fusing, using the weighting factors, the data from the sensor systems for determining the surroundings and the data for the surroundings; (see at least Fig. 1-5 [0035-0077]: where the assessed vehicle system is a sensor system, the selected autonomous driving mode may indicate a weight to be given to the output of that sensor system in making driving decisions (e.g., where the reliability assessment receives condition data indicating low reliability of the camera system, the autonomous driving module may give low weight to the camera system in making driving decisions and higher weight to the output of other sensor systems).)
providing a control signal for activating the at least semi-automated vehicle based on the evaluation of the determined detection conditions; (see at least Fig. 1-5 [0035-0077]: the autonomous driving module uses the vehicle system reliability data to select an autonomous driving mode. This selection may involve one or more of the steps of: setting a level of autonomy for the vehicle; assigning weights to the outputs of various sensor systems; and/or activating or deactivating one or more auxiliary vehicle systems.) and
controlling the at least semi-automated vehicle using the control signal, wherein the controlling includes transitioning the at least semi-automated vehicle into a safe state. (see at least Fig. 1-5 [0035-0077]: the autonomous driving module uses the vehicle system reliability data to select an autonomous driving mode. This selection may involve one or more of the steps of: setting a level of autonomy for the vehicle; assigning weights to the outputs of various sensor systems; and/or activating or deactivating one or more auxiliary vehicle systems. The vehicle may include further systems and features for acting on the reliability data, e.g. selecting an autonomous driving mode, activation/deactivation of auxiliary systems, warning about hazardous conditions, and so on.)
It may be alleged that Abolfathi does not explicitly teach evaluating the determined detection conditions based on an overall integrity value, the overall integrity value being based on a first integrity value and a second integrity value, wherein each of the first integrity value and the second integrity value is determined using an automotive safety integrity level (ASIL) provided by the sensor systems.
Mihai is directed to a method and system for evaluating and fusing sensor data of a vehicle. Mihai teaches evaluating the determined detection conditions based on an overall integrity value, the overall integrity value being based on a first integrity value and a second integrity value, wherein each of the first integrity value and the second integrity value is determined using an automotive safety integrity level (ASIL) provided by the sensor systems; (see at least Fig. 1 [0009-0037]: In the event of a contradiction between first sensor data with a first reliability and second sensor data with a second reliability, the sensor data with the lower reliability are disregarded or given less weight. It is also appropriate to assign a higher reliability to data of the digital environment map generated with a certified sensor, in particular with an ASIL-certified sensor, than to data generated with a non-certified sensor and/or a sensor with a lower ASIL certification. In particular, it is possible to validate a digital environment map using ASIL-certified vehicle sensors. Driving maneuvers and/or routes are planned based on the fusion of the sensor data with the digital environment map.)
providing a control signal for activating the at least semi-automated vehicle based on the evaluation of the determined detection conditions; (see at least Fig. 1 [0009-0037]: Driving maneuvers and/or route are planned based on the fusion of sensor data with the digital environment map wherein the driving maneuver is, for example, braking, accelerating, steering, changing lanes, turning, parking and the like.)
controlling the at least semi-automated vehicle using the control signal, wherein the controlling includes transitioning the at least semi-automated vehicle into a safe state. (see at least Fig. 1 [0009-0037]: Driving maneuvers and/or route are planned based on the fusion of sensor data with the digital environment map wherein the driving maneuver is, for example, braking, accelerating, steering, changing lanes, turning, parking and the like.)
Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified Abolfathi’s system and method for autonomous vehicle reliability assessment to incorporate the technique of evaluating the determined detection conditions based on an overall integrity value, the overall integrity value being based on a first integrity value and a second integrity value, wherein each of the first integrity value and the second integrity value is determined using an automotive safety integrity level (ASIL) provided by the sensor systems, as taught by Mihai, with a reasonable expectation of success, and doing so would make it possible to validate grids and/or the classification of objects to prevent incorrect interpretation of sensor data (Mihai [0015]).
Regarding claim 2, the combination of Abolfathi in view of Mihai teaches The method as recited in claim 1,
Abolfathi further teaches providing a first confidence value of the determined detection conditions; (see at least Fig. 1-3 [0034-0063]: the vehicle system reliability data may take different forms; for example, an assessed camera system might distinguish between complete failure of the camera and low confidence in the camera system to detect objects outside of a certain angle of vision or past a certain distance due to environmental conditions. When receiving condition data indicating low reliability of the camera system, the autonomous driving module may decide to give low weight to the camera system in making driving decisions due to low confidence in the output of the camera system and will give higher weight to other sensor systems (e.g. a different threshold/greater weight may be applied to the radar system). That is, providing a low confidence value to a camera system and a greater/higher confidence value to a radar system based on detecting the current environment experiencing heavy rain.)
providing a second confidence value of the determined detection conditions; (see at least Fig. 1-3 [0034-0063]: the vehicle system reliability data may take different forms; for example, an assessed camera system might distinguish between complete failure of the camera and low confidence in the camera system to detect objects outside of a certain angle of vision or past a certain distance due to environmental conditions. When receiving condition data indicating low reliability of the camera system, the autonomous driving module may decide to give low weight to the camera system in making driving decisions due to low confidence in the output of the camera system and will give higher weight to other sensor systems (e.g. a different threshold/greater weight may be applied to the radar system). That is, providing a low confidence value to a camera system and a greater/higher confidence value to a radar system based on detecting the current environment experiencing heavy rain.)
determining an overall confidence value for the determined detection condition, based on the first confidence value and the second confidence value; (see at least Fig. 1-3 [0034-0063]: condition data from two or more vehicle systems may be combined together and/or combined with other data to determine the reliability/confidence of a given vehicle system. For example, where the received condition data indicates low reliability of the camera system (e.g. due to low light conditions, glare, lens occlusion), the autonomous driving module may decide to give low weight to the camera system and higher weight to the output of other sensor systems, e.g., the ultrasound system and GPS.)
evaluating the determined detection conditions using the overall confidence value. (see at least Fig. 1-3 [0034-0063]: condition data from two or more vehicle systems may be combined together and/or combined with other data to determine the reliability/confidence of a given vehicle system. For example, where the received condition data indicates low reliability of the camera system (e.g. due to low light conditions, glare, lens occlusion), the autonomous driving module may decide to give low weight to the camera system and higher weight to the output of other sensor systems, e.g., the ultrasound system and GPS.)
Regarding claim 3, the combination of Abolfathi in view of Mihai teaches The method as recited in claim 2,
Abolfathi further teaches wherein the determination of the overall confidence value is based on a probabilistic combination of the first confidence value and the second confidence value. (see at least Fig. 1-4 [0055-0064]: For example, where the assessment unit 310 receives condition data 136 indicating low reliability of the camera system 120 (due to, e.g., low light conditions, glare, or lens occlusion), the autonomous driving module 312 may decide to give low weight to the camera system 120 in making driving decisions due to the low confidence in the output of the camera system 120. Instead, its driving decision-making processes or algorithms will give higher weight to the output of other sensor systems 110 such as, e.g., its ultrasound system 126 and its GPS 128.)
Regarding claim 4, the combination of Abolfathi in view of Mihai teaches The method as recited in claim 1,
Abolfathi further teaches wherein the determination of the overall integrity value is based on an arithmetic function of the first integrity value and the second integrity value (see at least Fig. 1-5 [0035-0077]: where the assessed vehicle system is a sensor system, the selected autonomous driving mode may indicate a weight to be given to the output of that sensor system in making driving decisions (e.g., where the reliability assessment receives condition data indicating low reliability of the camera system, the autonomous driving module may give low weight to the camera system in making driving decisions and higher weight to the output of other sensor systems).)
Regarding claim 5, the combination of Abolfathi in view of Mihai teaches The method as recited in claim 1, Abolfathi further teaches wherein the determined detection conditions include: at least one global detection condition and/or at least one zone-specific detection condition, and/or at least one local detection condition, and/or at least one dynamic detection condition. (see at least Fig. 1-5 [0035-0077]: weather conditions, lighting conditions, signal distortion conditions (e.g. situations when radio, GPS and other signals sent to or from the vehicle are reflected or blocked by objects in the environment), glare, light reflections, visual field blockage, vehicle speed, road conditions (e.g. ice, wetness, flooding, damaged road surface, type of road surface), geographic conditions (e.g. whether the vehicle is currently in an industrial, commercial, residential, highway, or rural road area), and detection of other objects in the area (e.g. debris on the road, tree canopy cover or tunnel roofs that could block GPS signals, other vehicles that could block sensors or could pose collision hazards).)
Regarding claim 6, the combination of Abolfathi in view of Mihai teaches The method as recited in claim 1, Abolfathi further teaches wherein the first integrity value is provided based on a first basis for determining the detection conditions, and the second integrity value is provided based on a second basis for determining the detection conditions, the first basis being different from the second basis, wherein each of the first basis and second basis: (i) is based on data of at least one sensor dedicated to the detection conditions, and/or (ii) is based on a pattern recognition for the detection conditions of data of at least one exteroceptive sensor, and/or (iii) is based on a pattern recognition for the detection conditions of data of at least two exteroceptive sensors, and/or (iv) is based on an evaluation of results of a data processing of at least one sensor, and/or (v) is based on at least one of the detection conditions, which is assigned to geographical data and describes surroundings of a corresponding sensor in greater details, and/or (vi) is based on an evaluation of map topographies, and/or (vii) is based on data provided by road users in the surroundings of the corresponding sensor. (see at least Fig. 1-5 [0035-0077]: where the assessed vehicle system is a sensor system, the selected autonomous driving mode may indicate a weight to be given to the output of that sensor system in making driving decisions (e.g., where the reliability assessment receives condition data indicating low reliability of the camera system, the autonomous driving module may give low weight to camera system in making driving decisions and higher weight to output of other sensor system). The vehicle has a number of vehicle systems, which may include both sensor systems (e.g. 
camera system, LIDAR system, radar system, ultrasound system, GPS 128, external temperature sensor, precipitation sensor, speedometer, inertial measurement unit, odometer, microphone) as well as non-sensor systems 112 (e.g. steering system 135, braking system 134, transmission, electrical system, climate control, dashboard system). The vehicle may also have one or more vehicle system diagnostic sensors configured to diagnose the status of the vehicle systems. Environmental conditions can include, but are not limited to: weather conditions, lighting conditions, signal distortion conditions (e.g. situations when radio, GPS and other signals sent to or from the vehicle are reflected or blocked by objects in the environment), glare, light reflections, visual field blockage, vehicle speed, road conditions (e.g. ice, wetness, flooding, damaged road surface, type of road surface), geographic conditions (e.g. whether the vehicle is currently in an industrial, commercial, residential, highway, or rural road area), and detection of other objects in the area (e.g. debris on the road, tree canopy cover or tunnel roofs that could block GPS signals, other vehicles that could block sensors or could pose collision hazards).)
Regarding claim 8, the combination of Abolfathi in view of Mihai teaches The method as recited in claim 1,
It may be alleged that Abolfathi does not explicitly teach wherein the evaluated detection conditions are assigned to a representation of the surroundings of a sensor system.
Mihai is directed to a method and system for evaluating and fusing sensor data of a vehicle. Mihai teaches wherein the evaluated detection conditions are assigned to a representation of the surroundings of a sensor system. (see at least Fig. 1 [0009-0037]: the digital environment map provides an additional validation option for sensor data relating to a vehicle environment, making it possible to validate grids and/or the classification of objects.)
Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified Abolfathi’s system and method for autonomous vehicle reliability assessment to incorporate the technique of assigning the evaluated detection conditions to a representation of the surroundings of a sensor system, as taught by Mihai, with a reasonable expectation of success, and doing so would make it possible to validate grids and/or the classification of objects to prevent incorrect interpretation of sensor data (Mihai [0015]).
Claims 15-16 are rejected under 35 U.S.C. 103 as being unpatentable over Abolfathi in view of Mihai and Tiwari et al. (US 2018/0267558 A1 hereinafter Tiwari).
Regarding claim 15 (similarly claim 16), the combination of Abolfathi in view of Mihai teaches the method as recited in claim 1 (similarly claim 14). The combination of Abolfathi in view of Mihai does not explicitly teach wherein the transitioning of the at least semi-automated vehicle into the safe state includes a slow stopping of the at least semi-automated vehicle on a road shoulder.
Tiwari is directed to a system and method for collecting and processing sensor data for facilitating and/or enabling autonomous or semi-autonomous operation of a vehicle. Tiwari teaches wherein the transitioning of the at least semi-automated vehicle into the safe state includes a slow stopping of the at least semi-automated vehicle on a road shoulder. (see at least Fig. 6 [0065-0094]: the system and method include collecting surrounding data to perceive the surroundings of the vehicle for subsequent processing and/or determination; processing the collected surrounding data to extract characteristics of the surroundings for producing at least one output that includes any one or more of an object localization dataset, object classification dataset, object detection dataset, feature dataset, object signature, environmental data, and any other suitable output resulting from processing surrounding data; generating a confidence metric based on an output of the processed surrounding data, wherein the confidence metric describes confidence in the quality of the surrounding data; receiving an instruction indicating a desired operation of one or more vehicle subsystems; and automatically controlling the vehicle to pull over into a shoulder region of a roadway on which the vehicle is traveling based on the confidence metric (e.g. exceeding/falling below a threshold value).)
Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Abolfathi and Mihai to incorporate the technique of transitioning the at least semi-automated vehicle into a safe state by slowly stopping the vehicle on a road shoulder, as taught by Tiwari, with a reasonable expectation of success to ensure the autonomous vehicle’s operational safety and to improve the user’s confidence in the autonomous driving system.
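For illustration only, the confidence-metric pull-over behavior described in the cited Tiwari passage can be sketched as follows. All function names, the confidence heuristic, and the 0.5 threshold are hypothetical assumptions for this sketch, not Tiwari’s actual implementation:

```python
# Illustrative sketch only: hypothetical names and heuristic, not Tiwari's implementation.

def process_surroundings(sensor_frames):
    """Extract outputs (e.g., object detections) from raw surround data."""
    # Placeholder: a real perception stack would run detection/classification here.
    return {"detections": sensor_frames}

def confidence_metric(outputs):
    """Describe confidence in the quality of the surrounding data."""
    # Placeholder heuristic: fewer usable detections -> lower confidence.
    return min(1.0, len(outputs["detections"]) / 10.0)

def select_maneuver(sensor_frames, threshold=0.5):
    """Pull over to the shoulder when confidence falls below a threshold."""
    outputs = process_surroundings(sensor_frames)
    if confidence_metric(outputs) < threshold:
        return "pull_over_to_shoulder"  # transition into a safe state
    return "continue_nominal_operation"
```

As in the cited passage, the control decision is driven by comparing the generated confidence metric against a threshold value.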
Claims 17-18 are rejected under 35 U.S.C. 103 as being unpatentable over Abolfathi in view of Mihai and Nariyambut Murali et al. (US 2017/0109644 A1 hereinafter Murali).
Regarding claim 17 (similarly claim 18), the combination of Abolfathi in view of Mihai teaches the method as recited in claim 1 (similarly claim 14).
The combination of Abolfathi in view of Mihai does not explicitly teach further comprising: providing a three-dimensional grid representing the surroundings of the vehicle,
the evaluated determined detection conditions being assigned to grid cells of the grid.
Murali is directed to a system and method for sensor fusion. Murali teaches providing a three-dimensional grid representing the surroundings of the vehicle (see at least Fig. 3-6 [0043-0067]: the model component models a sensing system of a vehicle in which the sensor fusion component or automated driving/assistance system is located. A region near or surrounding a vehicle may be logically divided into a 2D or 3D grid having a plurality of cells, and each cell of the grid may have a corresponding node in a graphical model, which models an occupancy (e.g., presence of an object in that cell), velocity, material type, object recognition, or any other object detection and tracking information. For example, each cell of the grid may correspond to a specific region near a vehicle (e.g., with a fixed location with respect to a moving vehicle), and one or more aspects of an object (or absence of an object) may be inferred for that cell based on the model and the most recently gathered sensor data).
Murali further teaches the evaluated determined detection conditions being assigned to grid cells of the grid (see at least Fig. 3-6 [0043-0067], as discussed above: each cell of the grid may have a corresponding node in a graphical model that models occupancy, velocity, material type, object recognition, or any other object detection and tracking information, and one or more aspects of an object (or absence of an object) may be inferred for each cell based on the model and the most recently gathered sensor data).
Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Abolfathi and Mihai to incorporate the technique of providing a three-dimensional grid representing the surroundings of the vehicle, with the evaluated determined detection conditions being assigned to grid cells of the grid, as taught by Murali, with a reasonable expectation of success, such that the automated driving/assistance system can effectively select a driving path based on the detection and tracking of physical objects using cell values of a grid corresponding to a region surrounding a vehicle (Murali [0062]).
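For illustration only, the cell-based assignment described in the cited Murali passage can be sketched as follows. The grid structure, field names, and the example condition string are hypothetical assumptions for this sketch, not Murali’s actual data model:

```python
# Illustrative sketch only: hypothetical structure, not Murali's actual data model.

def make_grid(nx, ny, nz):
    """A 3D grid of cells, each holding per-cell state (e.g., detection conditions)."""
    return {(x, y, z): {"occupancy": None, "detection_condition": None}
            for x in range(nx) for y in range(ny) for z in range(nz)}

def assign_detection_condition(grid, cell, condition):
    """Assign an evaluated detection condition to one grid cell."""
    grid[cell]["detection_condition"] = condition

# The grid logically divides a region surrounding the vehicle into cells,
# and per-cell information is updated as sensor data is gathered.
grid = make_grid(4, 4, 2)
assign_detection_condition(grid, (1, 2, 0), "degraded_by_fog")
```

Each cell corresponds to a fixed region relative to the vehicle, mirroring the passage’s mapping of cells to nodes that carry detection and tracking information.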
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DANA F ARTIMEZ whose telephone number is (571)272-3410. The examiner can normally be reached M-F: 9:00 am-3:30 pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Faris S. Almatrahi can be reached at (313) 446-4821. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DANA F ARTIMEZ/ Examiner, Art Unit 3667
/FARIS S ALMATRAHI/ Supervisory Patent Examiner, Art Unit 3667