Prosecution Insights
Last updated: April 19, 2026
Application No. 18/579,465

DRIVING ASSISTANCE SYSTEM AND VEHICLE

Final Rejection §103
Filed: Jan 15, 2024
Examiner: TARKO, ASMAMAW G
Art Unit: 2482
Tech Center: 2400 — Computer Networks
Assignee: ArcSoft Corporation Limited
OA Round: 2 (Final)
Grant Probability: 72% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 0m
With Interview: 81%

Examiner Intelligence

Career Allow Rate: 72% (284 granted / 395 resolved; +13.9% vs TC avg) — above average
Interview Lift: +9.3% (moderate) across resolved cases with interview
Typical Timeline: 3y 0m average prosecution; 24 applications currently pending
Career History: 419 total applications across all art units
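The headline figures above are simple ratios over the examiner's resolved docket. A minimal sketch of the arithmetic (Python; the counts are taken from this page, and the round-to-nearest-percent display convention is an assumption):

```python
# Career allow rate as displayed: granted / resolved, shown as a percent.
# GRANTED and RESOLVED are the figures reported on this page.
GRANTED = 284
RESOLVED = 395

def allow_rate(granted: int, resolved: int) -> float:
    """Return the allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

rate = allow_rate(GRANTED, RESOLVED)  # ~71.9, displayed as 72%
```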

Statute-Specific Performance

§101: 3.4% (-36.6% vs TC avg)
§103: 58.2% (+18.2% vs TC avg)
§102: 23.9% (-16.1% vs TC avg)
§112: 4.4% (-35.6% vs TC avg)
Tech Center averages are estimates. Based on career data from 395 resolved cases.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Remarks

This communication is in response to the Applicant's Amendment filed on 12/12/2025. Claims 1-19 were pending. Independent claim 1 and dependent claim 4 are amended. Claims 1-19 remain pending. The objection to claim 19 is withdrawn in view of the Applicant's remark clarifying that claim 19 should be treated as a dependent claim.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-12 and 14-19 are rejected under 35 U.S.C. 103 as being unpatentable over HIROKI (US 20200278743 A1, hereinafter "HIROKI") in view of Kondiparthi et al. (US 20230162464 A1, hereinafter "Kondiparthi").

Regarding claim 1 (Currently Amended): HIROKI discloses a driving assistance system, comprising a vehicle attitude detection unit, camera units, a processing unit, and a display unit, wherein the vehicle attitude detection unit is configured to detect a driving state of a vehicle in real time after the vehicle is started and obtain a vehicle attitude detection signal (0006, 0018, 0023-0025 and 0040; Figures 1-5); the processing unit has a first interface, a second interface, and a third interface; the processing unit is connected to the vehicle attitude detection unit through the first interface and is configured to receive the vehicle attitude detection signal outputted by the vehicle attitude detection unit and receive, through the second interface, image data obtained by the camera units corresponding to the vehicle attitude detection signal (0018 and 0049; Figures 1 and 5); and the display unit is connected to the third interface of the processing unit and is configured to receive and display the image data (0018 and 0049; Figures 1 and 5).

HIROKI fails to disclose a driving assistance system wherein the vehicle attitude detection signal comprises a vertical angle, and wherein, in response to a left- or right-turn driving state, the processing unit stitches side and front image data, and in response to a reversing state, stitches rear and side image data.
Kondiparthi, however, in the same field of endeavor, shows a driving assistance system wherein the vehicle attitude detection signal comprises a vertical angle (0058 and 0038), and wherein, in response to a left- or right-turn driving state, the processing unit stitches side and front image data, and in response to a reversing state, stitches rear and side image data (0028; "[0028] … if a vehicle is about to make a left turn, it may be desirable to produce a stitched image from component images acquired from vehicle front 101 and left-side 102 cameras with a virtual camera pose located above the vehicle in order to more clearly illustrate an intersection to the driver. Similar stitched images from pairs of images can be generated when a vehicle is about to turn right or reverse around a corner.").

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to combine Kondiparthi's image stitching method, which corresponds to the attitude of the vehicle, with the vehicle control device of HIROKI in order to detect the vehicle's attitude effectively and efficiently, provide accurate imagery of the vehicle's surroundings, and yield predictable results.

Regarding claim 2 (Original): HIROKI discloses the driving assistance system according to claim 1, wherein the processing unit further comprises a fourth interface and is configured to generate an enabling signal based on the vehicle attitude detection signal, send the enabling signal to the camera units through the fourth interface, and control the camera units to be enabled (0018-0021 and 0049-0050; Figures 1 and 5).

Regarding claim 3 (Previously Presented): HIROKI discloses the driving assistance system according to claim 1, wherein the driving assistance system further comprises a blind zone detection unit configured to determine a blind zone position of the vehicle in light of the vehicle attitude detection signal and obtain a blind zone detection signal (0019-0022 and 0029; Figures 1-3); and the processing unit determines the blind zone position based on the blind zone detection signal and receives, through the second interface, image data corresponding to the blind zone position and obtained by the camera units (0019-0022 and 0029; Figures 1-3).

Regarding claim 4 (Currently Amended): HIROKI discloses the driving assistance system according to claim 1, wherein the vehicle attitude detection signal further comprises at least one of: or a driving speed of the vehicle (0024, 0032 and 0053; Figures 2A-2B, 3 and 5).

Regarding claim 5 (Original): HIROKI discloses the driving assistance system according to claim 1, wherein the processing unit is further configured to compare the vehicle attitude detection signal with a preset value and receive the image data obtained by the camera units corresponding to the vehicle attitude detection signal based on a comparison result (0032, 0040 and 0042; Figure 3).

Regarding claim 6 (Original): HIROKI discloses the driving assistance system according to claim 1, wherein the vehicle attitude detection unit, the processing unit, the camera units, and the display unit are discrete components, partially integrated components, or completely integrated components (0051; Figure 5).

Regarding claim 7 (Previously Presented): HIROKI discloses the driving assistance system according to claim 1, wherein the processing unit further renders the image data and sends the processed image data to the display unit through the third interface for display (0018, 0025, 0035 and 0049; Figures 1 and 5).

Regarding claim 8 (Original): HIROKI discloses the driving assistance system according to claim 1, wherein the camera units are located on at least one of the following positions of the vehicle: a front side, a rear side, a left side, or a right side (0019-0020; Figure 1).

Regarding claim 9 (Original): HIROKI discloses the driving assistance system according to claim 1, wherein the processing unit synthesizes image data obtained by the camera units in at least two adjacent positions (0019-0020 and 0051; Figure 1).

Regarding claim 10 (Original): HIROKI discloses the driving assistance system according to claim 1, wherein the vehicle attitude detection signal represents at least one of the following driving states of the vehicle: straight driving, uphill driving, downhill driving, left turning, right turning, reversing, or parked (0024, 0032 and 0040; Figures 2A-2B and 3).

Regarding claim 11 (Previously Presented): HIROKI discloses the driving assistance system according to claim 1, wherein the driving assistance system further comprises a sight line detection unit configured to detect a sight line direction and/or a sight point position of a driver and obtain a sight line detection signal (0028-0029, 0038 and 0040; Figure 3); and the processing unit generates a display unit enabling signal based on the sight line detection signal and controls the display unit to be enabled (0028-0029, 0038 and 0040; Figure 3).

Regarding claim 12 (Original): HIROKI discloses the driving assistance system according to claim 1, wherein the vehicle attitude detection unit comprises at least one of: an inertial sensor, a camera (0018-0021, 0031-0032 and 0050; Figures 1-3 and 5), an infrared sensor, a radar (0032; Figure 3), a laser radar, or a GPS (0023 and 0031; Figures 2-3).

Regarding claim 14 (Original): HIROKI discloses the driving assistance system according to claim 1, wherein the processing unit detects an obstacle in the image data to obtain an obstacle detection result (0040; Figure 3).
Regarding claim 15 (Original): HIROKI discloses the driving assistance system according to claim 14, wherein the obstacle detection result is transmitted to the display unit for display or is transmitted to an alarm apparatus for alert (0024 and 0040; Figures 2-3).

Regarding claim 16 (Original): HIROKI discloses the driving assistance system according to claim 1, wherein content displayed by the display unit contains a distance and a direction indication (0024 and 0032; Figures 2-3).

Regarding claim 17 (Original): HIROKI discloses the driving assistance system according to claim 1, wherein a number of the camera units is at least four, such that environmental images covering a periphery of the vehicle can be obtained when all the camera units are enabled (0019; Figures 2A-2B).

Regarding claim 18 (Original): HIROKI discloses the driving assistance system according to claim 1, but fails to disclose wherein a field of view of the camera units is greater than or equal to 180 degrees. Kondiparthi, however, in the same field of endeavor, shows the driving assistance system wherein a field of view of the camera units is greater than or equal to 180 degrees (0024; Figure 1; "[0024] The illustrated fields of view subtend approximately 180 degrees. A wide field of view is typically achieved by the camera having a wide field of view lens, such a fisheye lens. A fisheye lens is preferable as these are generally cylindrically symmetric. In other applications of the invention, the field of view may be less or more than 180 degrees. Whilst a fisheye lens is preferred, any other lens that provides a wide field of view can be used. In this context, a wide field of view is a lens having a field of view over 100 degrees, preferably over 150 degrees and more preferably over 170 degrees. Typically, cameras with such a wide field of view result in imaging artefacts and distortions in acquired images.").

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to combine the fish-eye camera of Kondiparthi with the vehicle control device of HIROKI in order to capture a wide-angle image that could be used for a panoramic or wide view.

Regarding claim 19 (Previously Presented): Vehicle claim 19 is drawn to the vehicle corresponding to the system claimed in claim 1. Therefore, vehicle claim 19 corresponds to system claim 1 and is rejected for the same reasons of obviousness as set forth above.

Claim Rejections - 35 USC § 103

Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over HIROKI in view of Kondiparthi as applied to claim 1 above, and further in view of Millinger, III (US 20170371353 A1, hereinafter "Millinger").

Regarding claim 13 (Original): HIROKI in view of Kondiparthi fails to show the driving assistance system according to claim 12, wherein the inertial sensor comprises a three-axis gyroscope, a three-axis accelerometer, and a three-axis magnetometer. Millinger, however, in the same field of endeavor, shows an inertial sensor comprising a three-axis gyroscope, a three-axis accelerometer, and a three-axis magnetometer (0058; Figures 1-3). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to combine Millinger's inertial sensor with the driving assistance system of HIROKI in view of Kondiparthi in order to detect the vehicle's attitude effectively and efficiently and yield predictable results.

Response to Arguments

Applicant's arguments with respect to claims 1-19 have been considered but are moot in view of the new ground of rejection.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a).
Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ASMAMAW TARKO, whose telephone number is (571) 272-9205. The examiner can normally be reached Monday-Friday, 9:00 AM-5:00 PM EST.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Chris Kelley, can be reached at (571) 272-7331. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ASMAMAW G TARKO/
Patent Examiner, Art Unit 2482
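Amended claim 1 recites a mapping from driving state to the camera feeds that are stitched: side plus front image data when turning, rear plus side image data when reversing. A minimal sketch of that selection logic (the state enum, camera names, and the choice of which side feeds to use are hypothetical; the application's actual implementation is not disclosed here):

```python
from enum import Enum, auto

class DrivingState(Enum):
    """Hypothetical driving states, following the states recited in claim 1."""
    LEFT_TURN = auto()
    RIGHT_TURN = auto()
    REVERSING = auto()
    STRAIGHT = auto()

def cameras_to_stitch(state: DrivingState) -> list[str]:
    """Select which camera feeds to stitch for a given driving state,
    per the mapping recited in amended claim 1."""
    if state in (DrivingState.LEFT_TURN, DrivingState.RIGHT_TURN):
        # Turning: combine the turn-side view with the front view.
        side = "left" if state is DrivingState.LEFT_TURN else "right"
        return [side, "front"]
    if state is DrivingState.REVERSING:
        # Reversing: combine the rear view with the side views
        # (which side(s) the claim covers is an assumption here).
        return ["rear", "left", "right"]
    return []
```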

Prosecution Timeline

Jan 15, 2024: Application Filed
Jul 23, 2025: Non-Final Rejection (§103)
Dec 12, 2025: Response Filed
Mar 06, 2026: Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12529288: SYSTEMS AND METHODS FOR ESTIMATING RIG STATE USING COMPUTER VISION (granted Jan 20, 2026; 2y 5m to grant)
Patent 12511768: METHOD AND APPARATUS FOR DEPTH IMAGE ENHANCEMENT (granted Dec 30, 2025; 2y 5m to grant)
Patent 12506865: SYSTEMS AND METHODS FOR REDUCING A RECONSTRUCTION ERROR IN VIDEO CODING BASED ON A CROSS-COMPONENT CORRELATION (granted Dec 23, 2025; 2y 5m to grant)
Patent 12498482: CAMERA APPARATUS (granted Dec 16, 2025; 2y 5m to grant)
Patent 12469164: VEHICLE EXTERNAL DETECTION DEVICE (granted Nov 11, 2025; 2y 5m to grant)

Study what changed to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 72%
With Interview: 81% (+9.3%)
Median Time to Grant: 3y 0m
PTA Risk: Moderate
Based on 395 resolved cases by this examiner. Grant probability derived from career allow rate.
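The "with interview" figure appears to be the base grant probability plus the examiner's interview lift, capped at 100%. A sketch of that combination (the additive model is an assumption inferred from the numbers shown on this page):

```python
def with_interview(base_pct: float, lift_pct: float) -> float:
    """Grant probability after an examiner interview, assuming the
    interview lift is simply additive and capped at 100%."""
    return min(base_pct + lift_pct, 100.0)

# Base 72% plus the +9.3% interview lift from this page.
projected = with_interview(72.0, 9.3)  # 81.3, displayed as 81%
```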
