Prosecution Insights
Last updated: April 19, 2026
Application No. 17/434,721

AUTONOMOUS VEHICLE SYSTEM

Status: Final Rejection §103
Filed: Aug 27, 2021
Examiner: BUTLER, RODNEY ALLEN
Art Unit: 3666
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Intel Corporation
OA Round: 5 (Final)

Grant Probability: 88% (Favorable)
Expected OA Rounds: 6-7
Time to Grant: 2y 2m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 88% (851 granted / 965 resolved; +36.2% vs TC avg; above average)
Interview Lift: +11.1% (moderate), based on resolved cases with interview
Avg Prosecution: 2y 2m typical timeline; 34 applications currently pending
Career History: 999 total applications across all art units
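As a quick check on how the headline numbers relate, the dashboard figures can be reproduced from the career data above. The formulas below (simple division, plus an additive interview lift) are assumptions about how the tool derives them, not documented behavior:

```python
# Reproducing the dashboard figures from the examiner's career data.
# Inputs come from the page above; the formulas are assumed, not documented.

granted = 851           # career grants
resolved = 965          # career resolved cases
interview_lift = 0.111  # reported lift for cases with an interview

allow_rate = granted / resolved                         # ~0.882 -> "88%"
with_interview = min(allow_rate + interview_lift, 1.0)  # ~0.993 -> "99%"

print(f"Career allow rate: {allow_rate:.0%}")
print(f"Grant probability with interview: {with_interview:.0%}")
```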

Statute-Specific Performance

§101: 15.6% (-24.4% vs TC avg)
§102: 18.2% (-21.8% vs TC avg)
§103: 41.7% (+1.7% vs TC avg)
§112: 18.5% (-21.5% vs TC avg)

Tech Center averages are estimates. Based on career data from 965 resolved cases.

Office Action

Final Rejection — §103 (Jan 26, 2026)
DETAILED ACTION

Status of the Application

The present application is being examined under the pre-AIA first to invent provisions.

Status of the Claims

This action is in response to the applicant's filing on January 12, 2026. Claims 32 – 55 and 57 – 58 are pending and examined below.

Response to Arguments

Applicant's arguments with respect to claims 32 – 55 and 57 – 58 have been considered but are moot because the arguments do not apply to the new combination of references used in the current rejection.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 32 – 34, 44 – 45, 47 and 55 are rejected under 35 U.S.C. 103 as being unpatentable over the Sobhany publication in view of U.S. Patent Application Publication No. 2020/0114907 A1 to Sims et al. (hereinafter "Sims et al. publication") and U.S. Patent No. 10,671,068 B1 to Xu et al. (hereinafter "Xu et al. publication").
Note: Text written in bold typeface is claim language from the instant application. Text written in normal typeface consists of comments made by the Examiner and/or passages from the prior art reference(s).

As to claims 32, 33, 44 – 45 and 55, the Sobhany publication discloses an apparatus (310) comprising: an interface to receive sensor data from a plurality of sensors of an autonomous vehicle (see Abstract for “a sensor interface communicatively coupled to a plurality of sensors in the vehicle and a vehicle experience system”), the sensor data comprising first sensor data from a first type of sensor and second sensor data from a second type of sensor (see ¶42 for “example sensors 215 includ[ing] internal or external cameras, eye tracking sensors, temperature sensors, audio sensors, weight sensors in a seat 230, force sensors measuring force applied to devices such as a steering wheel or display 205, accelerometers, gyroscopes, light detecting and ranging (LIDAR) sensors, or infrared sensors”); and processing circuitry coupled to the interface (see ¶31, where “[t]he vehicle 110 can process the sensor data”; see ¶48, where “the vehicle experience system 310 can include programmable circuitry (e.g., one or more microprocessors), programmed with software and/or firmware, entirely in special-purpose hardwired (i.e., non-programmable) circuitry, or in a combination or such forms”; see ¶53, where “the vehicle experience system 310 can include . . .
a processing engine 330”), the processing circuitry to: abstract each of the first sensor data and the second sensor data to produce abstracted first sensor data and abstracted second sensor data, wherein the processing circuitry is to abstract the sensor data by: normalizing sensor response values of the sensor data (see ¶53, where the vehicle experience system 310 can include a sensor abstraction component 312; see also ¶54, where “[t]he sensor abstraction component 312 receives raw sensor data from the car network 350 and/or other data sources 355 and normalizes the inputs for processing by the processing engine 330”) (Emphasis added); and use the abstracted first sensor data and abstracted second sensor data in a perception phase of a control process for the autonomous vehicle, wherein the perception phase utilizes a common perception model. (See ¶54, where “[t]he sensor abstraction component 312 may be adaptable to multiple vehicle models and can be readily updated as new sensors are made available.”)

The Sobhany publication, however, fails to specifically disclose the processing circuitry abstracting the sensor data by: warping the sensor data; and filtering the sensor data; and, combining the abstracted first sensor data and the abstracted second sensor data; and using the combined abstracted first sensor data and abstracted second sensor data in a perception phase of a control process for the autonomous vehicle, wherein the perception phase utilizes a common perception model.

The Xu et al. publication discloses that sensor data can be captured by different sensors and shared across different sensor processing pipelines. (See Abstract.) The different sensor data shared across different sensor processing pipelines can be used to determine a perception decision based on the captured sensor data. Sensor data may be shared among multiple processing pipelines at different stages of the processing pipelines prior to determining a perception decision.
In this way, combined sensor data may be processed by some processing pipelines to reach a perception decision for the processing pipeline. The perception decisions of the different processing pipelines may also be combined to generate a final perception decision. (See Col. 1, lns 30 – 41.) The Xu et al. publication also discloses that “[c]ombining sensor data may also result in transforming the data (e.g., averaging values, correcting, warping, or skewing sensor data according to other sensor data). In some embodiments, the resolution of sensor data may be changed (e.g., by down sampling, blurring, or filtering the sensor data). Combining sensor data may also be implemented by pairing, registering, co-locating, or otherwise annotating one set of sensor data by another.” (See Col. 11, ln 63 through Col. 12, ln 4.) (Emphasis added.) Such disclosure suggests the processing circuitry abstracting the sensor data by warping the sensor data and filtering the sensor data.

Based on a reasonable expectation of success, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the Sobhany publication so that the processing circuitry abstracts the sensor data by warping the sensor data and filtering the sensor data, as suggested by the Xu et al. publication, in order to leverage multiple perception decisions to facilitate determining the control actions performed.

The modified Sobhany publication discloses the invention substantially as claimed, except for combining the abstracted first sensor data and the abstracted second sensor data; and using the combined abstracted first sensor data and abstracted second sensor data in a perception phase of a control process for the autonomous vehicle, wherein the perception phase utilizes a common perception model.
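For readers less familiar with the sensor-abstraction steps at issue (normalizing, warping, and filtering raw sensor data before perception), a minimal sketch may help. Every function name and parameter below is a hypothetical illustration; nothing here is drawn from Sobhany, Xu, or any other cited reference:

```python
import numpy as np

# Hypothetical sketch of the claimed abstraction steps: normalize, warp,
# filter. Not an implementation from any cited reference.

def normalize(frame: np.ndarray) -> np.ndarray:
    """Normalize sensor response values into the [0, 1] range."""
    lo, hi = frame.min(), frame.max()
    return (frame - lo) / (hi - lo) if hi > lo else np.zeros_like(frame)

def warp(frame: np.ndarray, scale: int = 2) -> np.ndarray:
    """Spatially downscale by block-averaging (one simple warping operation)."""
    h, w = frame.shape
    return frame[: h - h % scale, : w - w % scale].reshape(
        h // scale, scale, w // scale, scale
    ).mean(axis=(1, 3))

def filter_(frame: np.ndarray, k: int = 3) -> np.ndarray:
    """Smooth with a box filter to reduce sensor noise."""
    kernel = np.ones((k, k)) / (k * k)
    padded = np.pad(frame, k // 2, mode="edge")
    out = np.zeros_like(frame)
    for i in range(frame.shape[0]):
        for j in range(frame.shape[1]):
            out[i, j] = (padded[i : i + k, j : j + k] * kernel).sum()
    return out

def abstract(frame: np.ndarray) -> np.ndarray:
    """Abstract raw sensor data: normalize, then warp, then filter."""
    return filter_(warp(normalize(frame.astype(float))))
```

Each stage maps roughly onto one claimed limitation: `normalize` onto normalizing sensor response values, `warp` onto a downscaling/warping operation, and `filter_` onto noise filtering.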
Combining or fusing perception data gathered by a first sensor and perception data gathered by a second sensor is old and well-known, as demonstrated by the Xu et al. publication, as discussed herein above, and/or the Sims et al. publication whose “[automated driving system or] ADS 24 includes multiple distinct control systems, including at least a perception system 32 for determining the presence, location, classification, and path of detected features or objects in the vicinity of the vehicle. The perception system 32 is configured to receive inputs from a variety of sensors [all mounted on the same vehicle 12], such as the sensors 26 illustrated in FIG. 1, and synthesize and process the sensor inputs to generate parameters used as inputs for other control algorithms of the ADS 24. The perception system 32 includes a sensor fusion and preprocessing module 34 that processes and synthesizes sensor data 27 from the variety of sensors 26. The sensor fusion and preprocessing module 34 performs calibration of the sensor data 27, including, but not limited to, LIDAR to LIDAR calibration, camera to LIDAR calibration, LIDAR to chassis calibration, and LIDAR beam intensity calibration. The sensor fusion and preprocessing module 34 outputs preprocessed sensor output 35.” (See FIGS. 1 – 2 and ¶33 et seq.)(Emphasis added.) Such disclosure suggests combining the abstracted first sensor data and the abstracted second sensor data, and using the combined abstracted first sensor data and abstracted second sensor data in a perception phase of a control process for the autonomous vehicle, wherein the perception phase utilizes a common perception model. 
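The fusion step recited in the claims (combining abstracted data from two sensor types into a single input consumed by one "common" perception model) can likewise be sketched. The stacking-and-averaging approach and the toy detector below are illustrative assumptions, not how Sims or Xu implement fusion:

```python
import numpy as np

# Hypothetical sketch of the claimed fusion step: two abstracted sensor
# grids are combined, and a single shared model consumes the result.

def fuse(cam: np.ndarray, lidar: np.ndarray) -> np.ndarray:
    """Combine two abstracted, same-shape sensor grids into one tensor."""
    return np.stack([cam, lidar], axis=0)  # channels-first fused input

def common_perception_model(fused: np.ndarray) -> bool:
    """Toy stand-in for a common perception model: flag any strong return."""
    return bool((fused.mean(axis=0) > 0.5).any())

cam = np.array([[0.2, 0.9], [0.1, 0.8]])    # abstracted camera data
lidar = np.array([[0.3, 0.7], [0.2, 0.9]])  # abstracted lidar data
print(common_perception_model(fuse(cam, lidar)))
```

The key structural point the claims turn on is that one model (not one per sensor) sees the combined data.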
Based on a reasonable expectation of success, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to further modify the Sobhany publication to combine the abstracted first sensor data and the abstracted second sensor data, and use the combined abstracted first sensor data and abstracted second sensor data in a perception phase of a control process for the autonomous vehicle, wherein the perception phase utilizes a common perception model, as suggested by the Xu et al. publication and/or the Sims et al. publication, in order to facilitate identifying objects in the environment and support navigation of an autonomous vehicle.

As to claims 34 and 47, the Sobhany publication discloses the processing circuitry is to: abstract the sensor data to produce first abstracted sensor data corresponding to the first sensor data and second abstracted sensor data corresponding to the second sensor data, wherein the processing circuitry is to abstract the sensor data by one or more of: normalizing sensor response values for each of the first sensor data and the second sensor data (see ¶53 – ¶54); warping each of the first sensor data and the second sensor data; and filtering each of the first sensor data and the second sensor data; and fusing the first and second abstracted sensor data, wherein the fused first and second abstracted sensor data are used in the perception phase (see ¶53 – ¶54 and ¶64 – ¶65).

Claims 35, 36, 48 – 49 and 57 – 58 are rejected under 35 U.S.C. 103 as being unpatentable over the Sobhany publication in view of the Sims et al. publication and the Xu et al. publication, and further in view of U.S. Patent No. 10,442,444 B1 to Bando et al. (hereinafter "Bando et al. publication").
As to claims 35 and 48, the modified Sobhany publication discloses the invention substantially as claimed, except for the processing circuitry to normalize sensor response values by one or more of normalizing pixel values of an image, normalizing a bit depth of an image, normalizing a color space of an image, and normalizing a range of depth or distance values in lidar data.

Normalization of sensor response values is old and well-known, as demonstrated by the Bando et al. publication, which discloses “[a] normalization module 402 [that] normalizes the received sensor signals and/or sensor data to provide normalized signals based on pre-fixed normalization parameters, such as the normalization parameters stored in the normalization parameter database 412.” (See Col. 7, lns 38 – 41.) Such disclosure suggests processing circuitry to normalize sensor response values by one or more of normalizing pixel values of an image, normalizing a bit depth of an image, normalizing a color space of an image, and normalizing a range of depth or distance values in lidar data.

Based on a reasonable expectation of success, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to further modify and provide the Sobhany publication with a normalization module so that the processing circuitry normalizes sensor response values by one or more of normalizing pixel values of an image, normalizing a bit depth of an image, normalizing a color space of an image, and normalizing a range of depth or distance values in lidar data, as suggested by the Bando et al. publication, in order to facilitate identifying objects in the environment and support navigation of an autonomous vehicle.

As to claims 36 and 49, the Sobhany publication, as modified by Col. 7, lns 38 – 41 of the Bando et al.
publication, is considered to disclose the processing circuitry normalizing sensor response values based on one or more sensor response models for the plurality of sensors.

As to claim 57, the Sobhany publication is considered to disclose the first sensor data being camera data and the second sensor data being lidar data. (See ¶42 for “example sensors 215 includ[ing] internal or external cameras, eye tracking sensors, temperature sensors, audio sensors, weight sensors in a seat 230, force sensors measuring force applied to devices such as a steering wheel or display 205, accelerometers, gyroscopes, light detecting and ranging (LIDAR) sensors, or infrared sensors.”)

As to claim 58, the Sobhany publication is considered to disclose the first sensor data being camera data from a first type of camera and the second sensor data being camera data from a second type of camera. (See ¶42.)

Claims 37 – 38 and 50 – 51 are rejected under 35 U.S.C. 103 as being unpatentable over the Sobhany publication in view of the Sims et al. publication and the Xu et al. publication, and further in view of the cited Browning et al. publication.

As to claims 37 – 38 and 50 – 51, the modified Sobhany publication discloses the invention substantially as claimed, except for the processing circuitry warping the sensor data by performing one or more of a spatial upscaling operation, a downscaling operation, a correction process for geometric effects associated with the sensor, and a correction process for motion of the sensor, and warping the sensor data based on sensor configuration information for the plurality of sensors. Warping sensor data is old and well-known, as demonstrated by the Browning et al.
publication, which discloses an image processor 1038 that uses a warping logic 1063, which includes rules or models to alter or skew dimensions of detected objects. (See ¶142 – ¶161.)

Based on a reasonable expectation of success, it would have been an obvious exercise of ordinary skill in the art before the effective filing date of the claimed invention to further modify the processing circuitry of the Sobhany publication to warp the sensor data by performing one or more of a spatial upscaling operation, a downscaling operation, a correction process for geometric effects associated with the sensor, and a correction process for motion of the sensor, and to warp the sensor data based on sensor configuration information for the plurality of sensors, as suggested by the Browning et al. publication, in order to facilitate identifying and locating objects in the environment and support navigation of an autonomous vehicle.

Claims 39 – 43 and 52 – 54 are rejected under 35 U.S.C. 103 as being unpatentable over the Sobhany publication in view of the Sims et al. publication and the Xu et al. publication, and further in view of U.S. Patent Application Publication No. 2017/0109644 to Nariyambut Murali et al. (hereinafter "Nariyambut Murali et al. publication").
As to claims 39 – 43 and 52 – 54, the modified Sobhany publication discloses the invention substantially as claimed, except for the processing circuitry filtering the sensor data by applying one or more of a Kalman filter, a variant of the Kalman filter, a particle filter, a histogram filter, an information filter, a Bayes filter, and a Gaussian filter; filtering the sensor data based on one or more of sensor noise models for the plurality of sensors and a scene model; filtering the sensor data by determining a validity of the sensor data and discarding the sensor data in response to determining that the sensor data is invalid; filtering the sensor data by determining a confidence level of the sensor data and discarding sensor data in response to determining that the sensor data is below a confidence threshold; and filtering the sensor data by determining a confidence level of the sensor data and discarding sensor data in response to determining that the sensor data is outside a range of values.

Filtering sensor data using various techniques to establish the validity of the sensor data or set a threshold is very old and well-known in the art, as demonstrated by at least the Nariyambut Murali et al.
publication, which discloses “[that] sensor fusion may use a Kalman filter, a particle filter, a WISH algorithm, and/or deep learning algorithms to produce inferred values for object detection, speed limits, driving behavior decisions, probability or confidence values, or other values that may be helpful for an automated driving/assistance system 102.”

Such disclosure suggests the processing circuitry is to filter the sensor data by applying one or more of a Kalman filter, a variant of the Kalman filter, a particle filter, a histogram filter, an information filter, a Bayes filter, and a Gaussian filter; filter the sensor data based on one or more of sensor noise models for the plurality of sensors and a scene model; filter the sensor data by determining a validity of the sensor data and discarding the sensor data in response to determining that the sensor data is invalid; filter the sensor data by determining a confidence level of the sensor data and discarding sensor data in response to determining that the sensor data is below a confidence threshold; and filter the sensor data by determining a confidence level of the sensor data and discarding sensor data in response to determining that the sensor data is outside a range of values.
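Two of the recited filtering techniques, a Kalman filter and confidence-threshold discarding, can be sketched concretely. The noise parameters and threshold below are illustrative assumptions, not values from any cited reference:

```python
# Hypothetical sketches of two recited filtering techniques:
# a scalar Kalman filter and confidence-threshold discarding.

def kalman_1d(measurements, q=1e-4, r=0.04):
    """Scalar Kalman filter: q = process noise, r = measurement noise."""
    x, p = measurements[0], 1.0   # initial state estimate and variance
    estimates = []
    for z in measurements:
        p += q                    # predict: variance grows by process noise
        k = p / (p + r)           # Kalman gain
        x += k * (z - x)          # update estimate toward the measurement
        p *= 1 - k                # shrink variance after the update
        estimates.append(x)
    return estimates

def discard_low_confidence(readings, threshold=0.6):
    """Drop (value, confidence) readings whose confidence is below threshold."""
    return [value for value, conf in readings if conf >= threshold]
```

The first function smooths a noisy scalar signal (e.g., a range estimate); the second implements the claimed discard-on-low-confidence behavior in its simplest form.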
Based on a reasonable expectation of success, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to further modify and provide the Sobhany publication with processing circuitry to filter the sensor data by applying one or more of a Kalman filter, a variant of the Kalman filter, a particle filter, a histogram filter, an information filter, a Bayes filter, and a Gaussian filter, filter the sensor data based on one or more of sensor noise models for the plurality of sensors and a scene model, filter the sensor data by determining a validity of the sensor data and discarding the sensor data in response to determining that the sensor data is invalid, filter the sensor data by determining a confidence level of the sensor data and discarding sensor data in response to determining that the sensor data is below a confidence threshold, and filter the sensor data by determining a confidence level of the sensor data and discarding sensor data in response to determining that the sensor data is outside a range of values, as suggested by the Nariyambut Murali et al. publication, in order to facilitate identifying objects in the environment and support navigation of an autonomous vehicle.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Examiner's Note(s)

The Examiner has cited particular paragraphs or columns and line numbers in the references applied to the claims above for the convenience of the applicant. Although the specified citations are representative of the teachings of the art and are applied to specific limitations within the individual claim, other passages and figures may apply as well. In preparing responses, the applicant is respectfully requested to fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the Examiner.

SEE MPEP 2141.02 [R-07.2015] VI. PRIOR ART MUST BE CONSIDERED IN ITS ENTIRETY, INCLUDING DISCLOSURES THAT TEACH AWAY FROM THE CLAIMS: A prior art reference must be considered in its entirety, i.e., as a whole, including portions that would lead away from the claimed invention. W.L. Gore & Associates, Inc. v. Garlock, Inc., 721 F.2d 1540, 220 USPQ 303 (Fed. Cir. 1983), cert. denied, 469 U.S. 851 (1984). See also MPEP § 2123. In addition, disclosures in a reference must be evaluated for what they would fairly teach one of ordinary skill in the art. See In re Snow, 471 F.2d 1400, 176 USPQ 328 (CCPA 1973) and In re Boe, 355 F.2d 961, 148 USPQ 507 (CCPA 1966).
Specifically, in considering the teachings of a reference, it is proper to take into account not only the specific teachings of the reference, but also the inferences that one skilled in the art would reasonably have been expected to draw from the reference. See In re Preda, 401 F.2d 825, 159 USPQ 342 (CCPA 1968) and In re Shepard, 319 F.2d 194, 138 USPQ 148 (CCPA 1963). Likewise, it is proper to take into consideration not only the teachings of the prior art, but also the level of ordinary skill in the art. See In re Luck, 476 F.2d 650, 177 USPQ 523 (CCPA 1973). Those of ordinary skill in the art are presumed to have some knowledge of the art apart from what is expressly disclosed in the references. See In re Jacoby, 309 F.2d 513, 135 USPQ 317 (CCPA 1962).

Any inquiry concerning this communication or earlier communications from the examiner should be directed to RODNEY A. BUTLER, whose telephone number is (313) 446-6513. The examiner can normally be reached on weekdays, Monday through Friday, between 9 a.m. and 5 p.m. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Anne M. Antonucci, can be reached on weekdays, Monday through Friday, between 9 a.m. and 5 p.m. at (313) 446-6519. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

Electronic Communications

Prior to initiating the first e-mail correspondence with any examiner, Applicant is responsible for filing a written statement with the USPTO in accordance with MPEP § 502.03 II. All received e-mail messages including e-mail attachments shall be placed into this application's record.

/RODNEY A BUTLER/
Primary Examiner, Art Unit 3666

Prosecution Timeline

Aug 27, 2021: Application Filed
Feb 03, 2024: Non-Final Rejection — §103
Jun 12, 2024: Response Filed
Jul 26, 2024: Final Rejection — §103
Oct 29, 2024: Interview Requested
Nov 18, 2024: Applicant Interview (Telephonic)
Nov 18, 2024: Examiner Interview Summary
Feb 03, 2025: Request for Continued Examination
Feb 04, 2025: Response after Non-Final Action
Feb 20, 2025: Non-Final Rejection — §103
May 27, 2025: Response Filed
Sep 08, 2025: Non-Final Rejection — §103
Jan 12, 2026: Response Filed
Jan 26, 2026: Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602050: Method of Operating a Printing Robot in Shadows (2y 5m to grant; granted Apr 14, 2026)
Patent 12600231: Vehicle Display System, Vehicle Display Method, and Storage Medium Storing Vehicle Display Program (2y 5m to grant; granted Apr 14, 2026)
Patent 12602051: Remote Support System and Mobile Body (2y 5m to grant; granted Apr 14, 2026)
Patent 12599054: Agricultural Work Assistance System, Agricultural Machine, and Agricultural Work Assistance Device (2y 5m to grant; granted Apr 14, 2026)
Patent 12589655: Battery Management System and Vehicle (2y 5m to grant; granted Mar 31, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 6-7
Grant Probability: 88%
With Interview: 99% (+11.1%)
Median Time to Grant: 2y 2m
PTA Risk: High

Based on 965 resolved cases by this examiner. Grant probability derived from career allow rate.
