Prosecution Insights
Last updated: April 19, 2026
Application No. 18/152,857

METHOD FOR DETERMINING THE RELIABILITY OF OBJECTS

Non-Final OA (§103)
Filed: Jan 11, 2023
Examiner: CASS, JEAN PAUL
Art Unit: 3666
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Robert Bosch GmbH
OA Round: 3 (Non-Final)

Grant Probability: 73% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 1m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 73% (719 granted / 984 resolved; above average, +21.1% vs TC avg)
Interview Lift: +25.9% among resolved cases with an interview (strong)
Typical Timeline: 3y 1m average prosecution; 83 applications currently pending
Career History: 1,067 total applications across all art units
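
As a sanity check, the headline figures above are reproducible from the career counts with simple ratios. A minimal sketch follows; the additive interview adjustment is an assumption inferred from the displayed numbers, not a documented methodology.

```python
# Hypothetical sketch of how the dashboard's headline numbers appear to be
# derived from the examiner's career counts. The formulas are assumptions
# inferred from the displayed figures, not a documented methodology.

granted = 719          # career grants (from the card above)
resolved = 984         # career resolved cases
interview_lift = 25.9  # percentage-point lift among cases with an interview

allow_rate = 100 * granted / resolved
print(f"Career allow rate: {allow_rate:.1f}%")   # -> 73.1%, shown as 73%

# The 99% "with interview" figure matches a simple additive lift,
# capped at 100%:
with_interview = min(allow_rate + interview_lift, 100.0)
print(f"With interview: {with_interview:.0f}%")  # -> 99%
```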

Statute-Specific Performance

§101: 10.5% (-29.5% vs TC avg)
§102: 12.6% (-27.4% vs TC avg)
§103: 56.8% (+16.8% vs TC avg)
§112: 12.8% (-27.2% vs TC avg)

Based on career data from 984 resolved cases; the Tech Center average is an estimate.
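
The implied Tech Center baseline can be backed out of each row (baseline = examiner rate minus the displayed delta); doing so with the table values suggests all four statutes compare against the same flat ~40% estimate. A small sketch, assuming the deltas are simple differences:

```python
# Recovering the implied Tech Center baseline from the table above.
# Assumes delta = examiner rate - TC average; field names are illustrative,
# as the dashboard does not expose its underlying data.

examiner_rates = {"§101": 10.5, "§102": 12.6, "§103": 56.8, "§112": 12.8}
deltas_vs_tc   = {"§101": -29.5, "§102": -27.4, "§103": 16.8, "§112": -27.2}

for statute, rate in examiner_rates.items():
    tc_avg = rate - deltas_vs_tc[statute]
    print(f"{statute}: examiner {rate:.1f}% vs TC avg ~{tc_avg:.1f}%")
# Every row backs out to ~40.0%, i.e. a single flat baseline estimate.
```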

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Applicant's Appeal Brief

The previous rejection is withdrawn. Applicant's brief is entered, and Applicant's remarks are entered into the record. A new search, necessitated by Applicant's remarks regarding a "redundancy metric," was made, a new reference (Tay) was found, and a new non-final rejection is made herein. Applicant's arguments are now moot in view of the new rejection of the claims. Overby is silent as to the claimed redundancy determination, but Tay teaches it, as detailed in the rejection of claim 1 below.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-2 and 4-9 are rejected under 35 U.S.C. § 103 as being unpatentable over U.S. Patent Application Pub. No. 2019/0379683 A1 to Overby et al. ("Overby," filed in 2018, prior to the effective filing date of January 11, 2023), in view of U.S. Patent No. 10,572,717 B1 to Zhu ("Zhu," filed in 2011 and assigned to Waymo), and further in view of U.S. Patent Application Pub. No. 2019/0196481 A1 to Tay et al. ("Tay," filed in 2018).

In regard to claims 1, 8, and 9, Overby discloses "...1. A method for a vehicle operation, the method comprising the following steps: a) receiving sensor data from a plurality of sensors;" (see FIG. 12a, where the vehicle has a LIDAR sensor 1264, a radar sensor 1260, a stereo camera, an infrared camera, a GNSS sensor, an IMU, an ultrasonic sensor, rear radar and lidar, a microphone, a speed sensor, a brake sensor, and a vibration sensor 1242; see also paragraph 31, where the lidar and radar can provide for autonomous or semi-autonomous operation of the vehicle).

Overby further discloses "b) for each of a plurality of objects, which the received sensor data indicates to have been detected by at least one of the plurality of sensors as being in a surrounding environment of the vehicle, associating with each of the plurality of sensors a respective object existence indicator that represents whether the respective sensor is one of the sensors that has detected the respective one of the plurality of objects;" (see paragraphs 94-103 and FIG. 1, where each device 104a-n can provide inputs to the environment 102, a crypto engine can encrypt the data and detect a threat in the packets 134, and, if acceptable, send the data to the ECUs 108 as secured data; see paragraphs 47-62 and 130-131, where the CAN controller can monitor the messages to determine, using a threat detector, a threat of an attack on the autonomous vehicle; see paragraphs 33 and 194; see FIG. 13, where processors 1311, 1321, and 1331 can authenticate and encrypt the sensor data; see FIG. 1, where this is considered LIDAR or radar sensor data in block 106; see paragraph 104, where increased latency can indicate an attack and an increased escalation level can be provided where network resources are limited; see paragraph 145, where the arbitration of the messages can be increased; see paragraph 96, where the sensor data can be fused from a LIDAR, a radar, an accelerometer, a GPS, and other vehicle sensors; and see paragraphs 109 and 124, where the sensor data may have an abnormal size. For example, if the received authenticated sensor data 1312, 1322 is not configured in an expected format (e.g., ASCII, LAS, CIFF, JPEG, RAW, etc.) or an expected size (e.g., Mbs, Gbs, etc.), the third cryptographic coprocessor 1331 does not validate it. Thus, the third cryptographic coprocessor 1331 can use several expected features of the authenticated sensor data 1312, 1322 to ensure that it is being received from the sensors 1310, 1320 and not from an outside threat. Once the third cryptographic coprocessor 1331 validates the authenticated sensor data 1312, 1322, the computer processors 1330 parse it for further processing, organizing the sensor data in accordance with at least one parsing rule, e.g., by sensor type, file type, file size, data type, or object type.)

Overby is silent, but Zhu teaches "...c) based on the associated object existence indicators," (see FIG. 10, where the raw camera image, the raw laser image, and the raw radar data 1012-1016 all provide for capturing the objects and then provide labels 1026-1036; see col. 14, line 1 to col. 16, line 25, where each of the sensors can capture the object, but one sensor may also completely miss the object, is then not redundant, and may miss 3 percent of all of the objects; see col. 17, lines 1-24, where an optimization procedure can be used to correct the bad sensor, or to rely more on the other sensors, since the one sensor keeps missing objects). It would have been obvious to one of ordinary skill in the art before the effective filing date to combine the disclosure of Overby with the teachings of Zhu, with a reasonable expectation of success, since Zhu can determine whether the radar, the camera, and the lidar all detect the object, or whether one sensor is missing the object, in which case the processor can perform an optimization procedure based on the missed detections, for example by introducing a reviewer to detect objects or by relying on a more accurate sensor. See the abstract and claims 1-20 of Zhu.

Overby is silent, but Tay teaches "...determining, for each of the plurality of objects, a respective degree of redundancy with which the respective object has been detected to be present in the surrounding environment; and" (see FIG. 3, where a field of view is taken, overlaps within the field of view are identified, a redundancy in operation between the overlapping fields of view is determined, that redundancy is compared as a metric against a threshold value, flags are set for high or insufficient redundancy, a turn is then made or not made, and further imaging can be taken in blocks 110-170), and "d) controlling a drive of the vehicle based on the determined respective degrees of redundancies." (see FIGS. 1-3, where the autonomous vehicle creates a redundancy map of objects labeled with high or insufficient redundancy; the high-redundancy metric can indicate a confident move, while the lower, insufficient-redundancy metric informs the navigation action, so that a left turn is not made in block 162 and further information is gathered in block 170). It would have been obvious to one of ordinary skill in the art before the effective filing date to combine the disclosure of Overby with the teachings of Tay, with a reasonable expectation of success, since Tay can determine whether the radar, the camera, and the lidar all detect the object, compute a metric for the "redundant" data in the field of view, and compare that metric against a threshold value in blocks 110-150; then, based on the redundant lidar or camera data, a left or right turn can be delayed, or taken if the data is redundant and confident according to the metric, to improve safe operation. See paragraphs 51-56 and claims 1-13 of Tay.

Overby discloses "...2. The method as recited in claim 1, wherein the object existence indicators are binary indicators." (See the Overby passages cited for step b) of claim 1 above, in particular paragraphs 96, 104, 109, 124, and 145 and the expected-format and expected-size checks on the authenticated sensor data 1312, 1322.)

Claim 3 is cancelled.

Overby discloses "...4. The method as recited in claim 1, wherein the object existence indicators are provided as a predefined number of bits." (See the Overby passages cited for step b) of claim 1 above.)

Zhu teaches "...5. (Currently Amended) The method as recited in claim 1, further comprising determining an object recognition reliability based on the determined degree of redundancy, wherein the controlling is performed based on the object reliability determination." (see col. 14, line 1 to col. 17, line 25). It would have been obvious to combine Overby and Zhu for the reasons given for step c) of claim 1 above.

Overby discloses "...5. The method as recited in claim 1, wherein the redundancy of the pieces of object existence information is considered in a determination of at least one value that describes the reliability of the object recognition." (See paragraph 5, where the data is not so-called "expected traffic" from the lidar or radar sensor, and the Overby passages cited for step b) of claim 1 above.)

Overby discloses "...6. The method as recited in claim 1, wherein the plurality of sensors includes one or multiple sensor modalities." (see paragraph 140, where the CAN message can be an unpermitted message and a mode selector 140 can then change the mode from a first mode to a safe mode without impacting the overall functioning of the vehicle).

Zhu teaches "...7. (Currently Amended) The method as recited in claim 1, wherein the objects for which the respective degrees of redundancy are determined are detected by various sensor modalities that detect a same detection area." (see col. 14, line 1 to col. 17, line 25). It would have been obvious to combine Overby and Zhu for the reasons given for step c) of claim 1 above.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JEAN PAUL CASS, whose telephone number is (571) 270-1934. The examiner can normally be reached Monday to Friday, 7 am to 7 pm, and Saturday, 10 am to 12 noon. Examiner interviews are available via telephone, in person, and by video conference using a USPTO-supplied web-based collaboration tool. To schedule an interview, Applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Scott A. Browne, can be reached at 571-270-0151. The fax number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/JEAN PAUL CASS/
Primary Examiner, Art Unit 3666
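
To make the rejected claim language concrete, here is a minimal, hypothetical sketch of the method recited in claim 1, using the binary, fixed-width indicators of claims 2 and 4: one existence bit per sensor per object, a degree of redundancy computed from those bits, and a threshold-gated driving decision. The sensor names, threshold, and control policy are illustrative assumptions, not code from the application or the cited references.

```python
# Minimal, hypothetical sketch of claim 1 as quoted in the rejection:
# a) receive sensor data; b) associate a binary object existence indicator
# per sensor per object (claims 2/4: binary, a predefined number of bits);
# c) determine a degree of redundancy per object from those indicators;
# d) gate the driving decision on the redundancy. Sensor names, the
# threshold, and the control policy are illustrative assumptions.

SENSORS = ("camera", "radar", "lidar")  # one bit per sensor (claim 4)

def existence_indicators(detections: dict[str, set[str]]) -> dict[str, int]:
    """Pack per-sensor detections into one bitmask per object (steps a-b)."""
    objects = set().union(*detections.values())
    masks = {}
    for obj in objects:
        mask = 0
        for bit, sensor in enumerate(SENSORS):
            if obj in detections.get(sensor, set()):
                mask |= 1 << bit
        masks[obj] = mask
    return masks

def degree_of_redundancy(mask: int) -> int:
    """Step c: redundancy = number of sensors that detected the object."""
    return bin(mask).count("1")

def control_decision(masks: dict[str, int], threshold: int = 2) -> str:
    """Step d: only proceed if every object was redundantly detected."""
    if all(degree_of_redundancy(m) >= threshold for m in masks.values()):
        return "proceed with maneuver"
    return "defer maneuver and gather more data"

detections = {
    "camera": {"pedestrian", "car"},
    "radar": {"car"},
    "lidar": {"pedestrian", "car"},
}
masks = existence_indicators(detections)
# pedestrian detected by 2 sensors, car by 3 -> proceed
print(control_decision(masks))
```

Packing the indicators into a bitmask keeps them a "predefined number of bits" (claim 4) while reducing the degree of redundancy to a simple population count.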

Prosecution Timeline

Jan 11, 2023: Application Filed
Oct 17, 2024: Non-Final Rejection (§103)
Feb 24, 2025: Response Filed
May 12, 2025: Final Rejection (§103)
Jul 28, 2025: Response after Non-Final Action
Sep 15, 2025: Notice of Allowance
Sep 15, 2025: Response after Non-Final Action
Sep 29, 2025: Response after Non-Final Action
Oct 08, 2025: Non-Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12593752: SYSTEM AND METHOD FOR CONTROLLING HARVESTING IMPLEMENT OPERATION OF AN AGRICULTURAL HARVESTER BASED ON TILT ACTUATOR FORCE (granted Apr 07, 2026; 2y 5m to grant)
Patent 12596986: GLOBAL ADDRESS SYSTEM AND METHOD (granted Apr 07, 2026; 2y 5m to grant)
Patent 12590801: REAL TIME DETERMINATION OF PEDESTRIAN DIRECTION OF TRAVEL (granted Mar 31, 2026; 2y 5m to grant)
Patent 12583572: MARINE VESSEL AND MARINE VESSEL PROPULSION CONTROL SYSTEM (granted Mar 24, 2026; 2y 5m to grant)
Patent 12571183: EXCAVATOR (granted Mar 10, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on the 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 73%
With Interview: 99% (+25.9%)
Median Time to Grant: 3y 1m
PTA Risk: High

Based on 984 resolved cases by this examiner; grant probability is derived from the career allow rate.
