DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 1-9 are presented for examination.
Claims 1-9 are rejected.
Priority
Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.
Response to Arguments
Applicant’s arguments, see page 7, filed 1/2/2025, with respect to the rejection(s) of claim(s) 1, 8, and 9 under 35 U.S.C. 102 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of Meyhofer (US 20180272963 A1).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claim(s) 1-9 are rejected under 35 U.S.C. 103 as being unpatentable over Fritzsche (DE 102015005961 A1), in view of Meyhofer (US 20180272963 A1).
Regarding Claim 1, Fritzsche discloses a method for processing sensor data in a system that includes multiple sensors for detecting at least one subarea of surroundings around a platform [0022] “a vehicle 1 which is equipped with three sensors 2, 3, 4 as environmental sensors. This may, for example, be a radar 2, a mono or stereo camera 3 and a laser scanner or LIDAR 4.”, the method comprising the following steps: a) reading, at least partially in parallel from the multiple sensors, respective sensor data streams of sensor data detected by respective ones of the multiple sensors [0009] “the object of the present invention to provide a method which is able to safely and reliably detect such a decalibration of a sensor from a network of at least two sensors so that suitable measures can be taken.” [0013] “the measurement signals of all sensors are compared in pairs, first with the measurement data of one sensor and then with the measurement data of the next sensor, after which it is determined which of the sensors has the decalibration.” [0022] “the data from the individual sensors 2, 3, 4 can be assigned to the same target object (i.e., in parallel)” Because the data from the multiple sensors are assigned to the same target object and compared against one another, Fritzsche reads in sensor data detected by the sensors at least partially in parallel. b) checking, on the basis of the read-in sensor data, whether there is an at least partial impairment of a detection by any one or more respective sensors of the multiple sensors [0008] “Impairment due to loose fastenings of the sensors and/or deformation, for example due to a collision or the like, are also conceivable as possible causes.
None of the known methods can safely and reliably detect such an impairment, especially if it occurs during the operation of a vehicle equipped with the sensors.” [0009] “It is therefore the object of the present invention to provide a method which is able to safely and reliably detect such a de-calibration of a sensor from a network of at least two sensors so that suitable measures can be taken.” Fritzsche thus identifies sensor impairments such as misalignment or mechanical failure, confirming that checking for an at least partial impairment of a detection is a function of the Fritzsche system.
Fritzsche does not appear to teach the full claim limitations regarding “c) dynamically selecting, based on a result of the checking, one of the sensor data streams as a main sensor data stream, wherein all one or more remaining ones of the sensor data streams other than the main sensor data stream are selected as at least one respective secondary sensor data stream” and “d) during operation of the platform, controlling the platform based on the multiple data streams, wherein, based on the selecting, the main sensor data stream is prioritized over the at least one respective secondary sensor data stream for the control”.
However, Meyhofer teaches equivalent teachings wherein c) dynamically selecting [0062] “an example method of operating a self-driving vehicle using dynamic sensor selection”, based on a result of the checking, one of the sensor data streams as a main sensor data stream [0033] “The sensor selection component 120 prioritizes the processing and/or use of sensor data 115 by (i) selecting one type of sensor data to the exclusion of another type of sensor data, and/or (ii) weighting sensor data by type so that sensor type may influence a determination of the SDV control system 100.” [0051] “the sensors, interfaces, or control system may detect faults or performance degradation in specific sensors, the compute stack of the vehicle, or from a mechanical system of the vehicle itself. Depending on the severity of the fault or degradation, the condition detection logic 230 can treat the sensor data 211 coming from any affected sensor as junk data to ignore or as degraded data to be given lower priority.”, wherein all one or more remaining ones of the sensor data streams other than the main sensor data stream are selected as at least one respective secondary sensor data stream [0034] “the sensor selection component 120 prioritizes, through either a weighting or selection process, each of the sensors 102 using a set of sensor priority rules that are based on expected performance characteristics of each of the sensors 102 in the detected conditions.” [0061] “Nevertheless, sensor data 211 from sensors that are not selected in the sensor priority 227 can still be collected, stored, and used for other purposes.” [0067] “Sensor prioritization logic 240 assigns priorities to sensor data from each sensor based on the retrieved sensor priority rules (330). In one aspect, the assigned priorities represent a set of values that weight the sensor data coming from each of the vehicle's sensors. 
For example, the sensor prioritization logic 240 may weight one sensor at 100% when that sensor is optimal (or superior to the other available sensors) in the current conditions. In another example, the sensor prioritization logic 240 may weight one sensor at 50% when that sensor is sub-optimal, but still provides partially reliable sensor data, in the current conditions.” Meyhofer thus shows that when one sensor stream is effectively prioritized as the main sensor data stream (e.g., weighted at 100%), the remaining lower-priority sensor streams (e.g., weighted at 50% or otherwise reduced) correspond to secondary sensor data streams that still exist and can still contribute.
and d) during operation of the platform, controlling the platform based on the multiple data streams [0027] “the control system 100 includes computational resources (e.g., processing cores and/or field programmable gate arrays (FPGAs)) which operate to process sensor data 115 received from sensors 102 of the SDV 10 that provide a sensor view of a road segment upon which the SDV 10 operates (i.e., during operation).”, wherein, based on the selecting, the main sensor data stream is prioritized over the at least one respective secondary sensor data stream for the control [0014] “select a set of sensors and prioritize the sensor data generated from the selected set of sensors to control aspects relating to the operation of the SDV.” [0068] “Components can then apply the sensor priority weights or selections to generated sensor data in order to control aspects relating to the operation of the self-driving vehicle” [0033] “The sensor selection component 120 prioritizes the processing and/or use of sensor data 115 by (i) selecting one type of sensor data to the exclusion of another type of sensor data, and/or (ii) weighting sensor data by type so that sensor type may influence a determination of the SDV control system 100.”
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine Fritzsche and Meyhofer to arrive at a system wherein c) one of the sensor data streams is dynamically selected, based on a result of the checking, as a main sensor data stream, wherein all one or more remaining ones of the sensor data streams other than the main sensor data stream are selected as at least one respective secondary sensor data stream, and d) during operation of the platform, the platform is controlled based on the multiple data streams, wherein, based on the selecting, the main sensor data stream is prioritized over the at least one respective secondary sensor data stream for the control.
A person of ordinary skill in the art would have been motivated to combine Fritzsche and Meyhofer to improve overall system reliability and accuracy [Meyhofer 0034] “Examples recognize that certain operating conditions present significant challenges to self-driving vehicles. In particular, weather such as fog, mist, rain, or snow can impair the ability of some of the sensors 102 to collect sensor data 115 with sufficient accuracy to reliably navigate the SDV 10 through an environment.”
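For illustration only (this sketch forms no part of the cited references or of the record), the dynamic weighting and selection that Meyhofer describes at [0033], [0051], and [0067] can be sketched as follows; the example weights of 100% and 50% track Meyhofer [0067], while all function and variable names are hypothetical:

```python
# Illustrative sketch of dynamic sensor-stream prioritization by weighting.
# Hypothetical names throughout; weights of 1.0 / 0.5 echo Meyhofer [0067].

def prioritize_streams(stream_names, impaired):
    """Assign a weight to each sensor stream; the highest-weighted stream
    becomes the main sensor data stream, the rest become secondary."""
    weights = {n: (0.5 if n in impaired else 1.0) for n in stream_names}
    main = max(stream_names, key=weights.get)           # first fully weighted stream
    secondary = [n for n in stream_names if n != main]  # still collected and usable
    return main, secondary, weights

# A degraded camera (e.g., in fog) is demoted to a secondary stream,
# while its data remains available to contribute at reduced weight:
main, secondary, weights = prioritize_streams(
    ["radar", "camera", "lidar"], impaired={"camera"})
```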
Regarding Claim 2, the combination of Fritzsche with Meyhofer teaches the method as recited in claim 1, and Fritzsche teaches wherein one or multiple of the sensors are camera sensors [0022] “a vehicle 1 which is equipped with three sensors 2, 3, 4 as environmental sensors. This may, for example, be a radar 2, a mono or stereo camera 3 and a laser scanner or LIDAR 4.”
Regarding Claim 3, the combination of Fritzsche with Meyhofer teaches the method as recited in claim 1, and Fritzsche teaches wherein the checking in step b) takes place based on a comparison of detections by different sensors of the sensors [0011] “The method according to the invention uses the measurement signals from at least two sensors of a sensor network to determine the mean values of the deviation of the measurement signals from one another via a cross-comparison.”
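For illustration only (no part of the record), the pairwise cross-comparison that Fritzsche describes at [0011] and [0013], in which the sensor common to all strongly deviating pairs is identified as decalibrated, can be sketched as follows; the names and the threshold value are hypothetical:

```python
from itertools import combinations

# Illustrative sketch of pairwise cross-comparison of measurement signals
# in the spirit of Fritzsche [0011], [0013]. Hypothetical names/threshold.

def find_decalibrated(measurements, threshold=0.5):
    """Compare all sensors in pairs via mean absolute deviation; a sensor
    whose pairings with every other sensor deviate strongly is flagged."""
    names = list(measurements)
    disagreements = {n: 0 for n in names}
    for a, b in combinations(names, 2):
        dev = sum(abs(x - y) for x, y in zip(measurements[a], measurements[b]))
        dev /= len(measurements[a])
        if dev > threshold:  # this pair of sensors disagrees
            disagreements[a] += 1
            disagreements[b] += 1
    # Decalibrated: disagrees with all (len(names) - 1) other sensors.
    return [n for n in names if disagreements[n] == len(names) - 1]

# The lidar deviates from both radar and camera, which agree with each other:
suspects = find_decalibrated({
    "radar":  [1.00, 1.00],
    "camera": [1.02, 0.98],
    "lidar":  [2.00, 2.10],
})
```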
Regarding Claim 4, the combination of Fritzsche with Meyhofer teaches the method as recited in claim 1, and Fritzsche teaches further comprising: providing at least one piece of information about: a selected sensor data stream, and/or an established impairment of the detection, and/or a position of a sensor of the sensors for a main data path changed due to the selection [0023] “In the case of minor deviations, appropriate recalibration is possible, and in the event of a continuous need for recalibration, a warning message can also be issued. In the event of a safety-critical deviation in the measurement signals of one of the sensors 2, 3, 4, the environment detection can be suspended and the autonomous driving of the vehicle can be switched off accordingly in order to continue to ensure safety.” The system issues a warning based on impaired sensor data and can also suspend autonomous driving of the vehicle.
Regarding Claim 5, the combination of Fritzsche with Meyhofer teaches the method as recited in claim 1, and Fritzsche teaches wherein the sensors are two optical sensors of a stereo camera [0022] “a vehicle 1 which is equipped with three sensors 2, 3, 4 as environmental sensors. This may, for example, be a radar 2, a mono or stereo camera 3 and a laser scanner or LIDAR 4.” [0011] “This method can thus be used to safely and reliably detect a decalibration of one of the sensors, for example a squinting stereo camera whose optical axes are misaligned to each other” The system includes a stereo camera, which comprises two optical sensors.
Regarding Claim 6, the combination of Fritzsche with Meyhofer teaches the method as recited in claim 1, and Fritzsche teaches wherein the sensor data or sensor data streams are processed at least partially separately from one another [0013] “the measurement signals of all sensors are compared in pairs, first with the measurement data of one sensor and then with the measurement data of the next sensor, after which it is determined which of the sensors has the decalibration.”
Regarding Claim 7, the combination of Fritzsche with Meyhofer teaches the method as recited in claim 1, and Fritzsche teaches wherein the system is a system for at least semi-assisted and/or automated driving [0018] “Especially in such an application for autonomous driving of a vehicle, the safe and reliable function of the sensors and a safe and reliable assessment of their correct alignment play a crucial role.”
Regarding Claim 8, the claim recites a non-transitory machine-readable memory medium with limitations parallel to those of claim 1, which are addressed for the reasons discussed above. Therefore, claim 8 is rejected under the same rationale.
Regarding Claim 9, the claim recites a system with limitations parallel to those of claim 1, which are addressed for the reasons discussed above. Therefore, claim 9 is rejected under the same rationale.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to HUSSAM ALZATEEMEH whose telephone number is (703)756-1013. The examiner can normally be reached 8:00-5:00 M-F.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Aniss Chad can be reached on (571) 270-3832. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/HUSSAM ALDEEN ALZATEEMEH/Examiner, Art Unit 3662
/ANISS CHAD/Supervisory Patent Examiner, Art Unit 3662