Prosecution Insights
Last updated: April 19, 2026
Application No. 18/847,459

DRIVING ASSISTANCE DEVICE

Status: Non-Final OA (§103)
Filed: Sep 16, 2024
Examiner: JHA, ABDHESH K
Art Unit: 3668
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: Hitachi Astemo, Ltd.
OA Round: 1 (Non-Final)

Grant Probability: 80% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 5m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 80% (328 granted / 408 resolved; +28.4% vs TC avg)
Interview Lift: +18.3% for resolved cases with an interview
Avg Prosecution: 2y 5m
Currently Pending: 24
Total Applications: 432 across all art units

Statute-Specific Performance

§101: 10.0% (-30.0% vs TC avg)
§103: 47.2% (+7.2% vs TC avg)
§102: 20.4% (-19.6% vs TC avg)
§112: 13.4% (-26.6% vs TC avg)

Tech Center averages are estimates. Based on career data from 408 resolved cases.
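The headline numbers above are consistent with simple arithmetic on the examiner's career data. A quick check; note that the implied Tech Center average and the additive treatment of the interview lift are back-derived assumptions, not figures from the record:

```python
# Sanity-check of the headline examiner statistics shown above.
# The granted/resolved counts come from the page itself; the TC
# average is back-derived and therefore only an estimate.

granted, resolved = 328, 408

allow_rate = granted / resolved
print(f"career allow rate: {allow_rate:.1%}")   # ~80.4%, displayed as 80%

# The page reports +28.4% vs the tech-center average, implying:
tc_avg = allow_rate - 0.284
print(f"implied TC average: {tc_avg:.1%}")      # ~52.0%

# If the +18.3% interview lift is additive on the allow rate,
# the with-interview figure would be roughly:
with_interview = allow_rate + 0.183
print(f"with interview: {with_interview:.1%}")  # ~98.7%, displayed as 99%
```

The displayed 99% therefore appears to be the rounded sum of the career allow rate and the interview lift.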

Office Action

§103
DETAILED ACTION

Claims 1-15 are pending and are considered in this Office action.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Specification

The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed. The applicant is encouraged to add descriptive language focusing on the inventive concept of the pending invention.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. That presumption is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.

Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. That presumption is rebutted when the claim limitation recites function without reciting sufficient structure, material, or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being so interpreted, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word “means” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitations use a generic placeholder coupled with functional language without reciting sufficient structure to perform the recited function, and the generic placeholder is not preceded by a structural modifier. Such claim limitations are:

a driving operation acquisition unit, a first observation information acquisition unit, a second observation information acquisition unit, a front observation unit, a rear observation unit, an obstacle determination unit, a storage unit, a surrounding situation collation unit, and a reverse playback backward movement assistance control unit in claims 1-4 and 9;
a follow-up vehicle detection unit and a follow-up vehicle following control unit in claim 5;
a passing space determination unit and a rear side movement assistance control unit in claims 6 and 9;
a moving object movement determination unit and a reroute generation travel assistance control unit in claims 10 and 11; and
an other-vehicle communication checking unit and a following control transmission/reception unit in claim 12.

Because these claim limitations are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, they are interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. All of the claim limitations listed above are being interpreted as disclosed in Figure 1 and corresponding Para. [0018].

If applicant does not intend to have these limitations interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitations to avoid such interpretation (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitations recite sufficient structure to perform the claimed function.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-4, 6-9, and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Fukuyasu (JP2008302711A) in view of Kojo et al. (JP2012173843A), hereinafter referred to as Fukuyasu and Kojo.

Regarding Claim 2, Fukuyasu teaches a driving assistance device (Para [0009]: “A start assist device for solving the above-described problems includes a parking locus acquisition means for acquiring a parking locus from the parking preparation position to the parking space when the vehicle is parked from the parking preparation position to the parking space, and a parking locus.”) comprising: a driving operation acquisition unit that acquires a driving operation of a driver of a host vehicle (Para [0081]: “When the sensor is normal (S82: No), the current position information of the vehicle 101 is acquired from the position detector 1 (S83), and the vehicle 101 is the vehicle surrounding information acquisition position at the start, that is, the previous vehicle surrounding information acquisition position at the time of parking. (S84: Yes), the attitude (orientation) of the vehicle 101 is calculated based on the data acquired from the gyroscope 3 (S85).
Subsequently, an image around the vehicle 101 is taken by the in-vehicle camera group 32, and an object around the vehicle 101 is detected by the radar group 34 (S86).”);

a front observation information acquisition unit that acquires front observation information of a front observation unit that observes a surrounding situation in front of the host vehicle; a rear observation information acquisition unit that acquires rear observation information of a rear observation unit that observes a surrounding situation behind the host vehicle (Para [0052]: “The in-vehicle camera group 32 captures the surroundings of the vehicle 101, and captures a road image in the immediate vicinity of the vehicle 101 with a plurality of cameras having continuous fields of view along the periphery of the vehicle 101. FIG. 2 shows an example of the mounting, in which eight cameras 32a to 32h are provided on the vehicle body of the vehicle 101. Each of the front right camera 32a, the front left camera 32b, the rear right camera 32c, the rear left camera 32d, the right front camera 32e, the right rear camera 32f, the left front camera 32g, and the left rear camera 32h. It is called. The in-vehicle camera group 32 corresponds to the vehicle surrounding information acquisition means when parked, the vehicle surrounding information acquisition means when starting, and the photographing means of the present invention.”);

an obstacle determination unit that determines at least a stationary object and other objects among obstacles by using front observation information of the front observation unit and rear observation information of the rear observation unit when the host vehicle moves forward (Para [0082]: “These captured images or detection results (starting vehicle surrounding information) are compared with parking vehicle surrounding information at the same position (S87). For example, when comparing captured images, the similarity between both images is checked, and if there is an object that has not been captured at the time of parking, it is determined whether or not the object is an obstacle. It is also checked whether or not the object photographed during parking has moved.”);

a storage unit that stores at least a stationary object among the obstacles determined by the obstacle determination unit and the driving operation acquired by the driving operation acquisition unit (Para [0070-0071]: “Subsequently, the in-vehicle camera group 32 captures an image around the vehicle 101, and the radar group 34 detects an object (such as the vehicle 102 in FIG. 3) around the vehicle 101 (S56). Then, the current position of the vehicle 101 at this time (that is, the acquisition position information of the surrounding vehicle information on the parking locus), the direction of the vehicle 101 detected by the gyroscope 3, the information of each sensor, the captured image, the object detection Information is memorize | stored in the database 21d as vehicle periphery information at the time of parking (S57). FIG. 7 shows a storage example of the vehicle periphery information at the time of parking. The vehicle periphery information at the time of parking is stored in association with the current position of the vehicle 101. Thereafter, when the vehicle surrounding information acquisition condition at the time of parking is satisfied (S58: Yes), the current position of the vehicle 101, the direction of the vehicle 101 detected by the gyroscope 3, the information of each sensor, and the captured image are the same as above. The object detection information is stored in the database 21d as the vehicle periphery information at the time of parking (S59 to S62).”).

Fukuyasu may not expressly teach a surrounding situation collation unit that collates an obstacle behind the host vehicle obtained by the rear observation unit with the stationary object stored by the storage unit when the host vehicle moves backward along a route where the host vehicle has moved forward; and a reverse playback backward movement assistance control unit that assists backward travel by reverse playback of a driving operation stored in the storage unit when the host vehicle moves forward, when the host vehicle moves backward along a route where the host vehicle has moved forward, wherein the reverse playback backward movement assistance control unit determines whether the reverse playback backward movement assistance is performed according to a result of the collation obtained by the surrounding situation collation unit.

Kojo teaches a surrounding situation collation unit that collates an obstacle behind the host vehicle obtained by the rear observation unit with the stationary object stored by the storage unit when the host vehicle moves backward along a route where the host vehicle has moved forward (Para [0057]: “Next, processing of the accumulated image selection unit 34 that is a processing block of the driving support ECU 3 and other processing contents related to the processing will be described. The stored image selection unit 34 selects a captured image suitable for matching with the captured image of the camera 2 from the travel history DB 5. First, the feature point detection unit 31 calculates a feature point of the current captured image. The feature point detection unit 31 also detects feature points for each accumulated image in the travel history DB 5. Then, the accumulated image selection unit 34 calculates how many feature points in common with the current captured image exist for each accumulated image. Then, the accumulated image selection unit 34 selects an accumulated image having the most common feature points.”); and a reverse playback backward movement assistance control unit that assists backward travel by reverse playback of a driving operation stored in the storage unit when the host vehicle moves forward, when the host vehicle moves backward along a route where the host vehicle has moved forward, wherein the reverse playback backward movement assistance control unit determines whether the reverse playback backward movement assistance is performed according to a result of the collation obtained by the surrounding situation collation unit (Para [0038]: “In step S123, based on the vehicle position information on the travel route, the direction of the route is determined so that the vehicle is used in the forward direction if it is close to the route start point, and in the reverse direction if it is close to the end point. The direction of the travel route is set so as to become, and the process proceeds to step S124. Note that, based on the relationship between the obstacle and the obstacle specified by the three-dimensional feature point group centered on the accumulated image selected by the accumulated image selection unit 34, the traveling route becomes a more appropriate route. You may correct | amend a path | route. In step S124, a button indicating whether or not to perform automatic driving is displayed on the display 6, and the driver's input can be received through the touch panel 7. When there is an automatic driving instruction from the driver, the process proceeds to step S125, and when there is no automatic driving instruction, the process proceeds to step S126.”).
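The accumulated-image selection that the examiner cites from Kojo's Para [0057] (pick the stored capture sharing the most feature points with the current frame) can be sketched as follows. All names and data here are illustrative, not from the reference; feature points are idealized as hashable descriptors, whereas a real system would use descriptor matching such as ORB or SIFT:

```python
# Illustrative sketch: select the stored image whose feature-point set
# has the largest overlap with the current camera frame.

def select_stored_image(current_features, stored_images):
    """stored_images: dict mapping image id -> set of feature descriptors."""
    best_id, best_overlap = None, -1
    for image_id, features in stored_images.items():
        overlap = len(current_features & features)  # features in common
        if overlap > best_overlap:
            best_id, best_overlap = image_id, overlap
    return best_id, best_overlap

current = {"a", "b", "c", "d"}
history = {
    "frame_010": {"a", "x", "y"},        # 1 feature in common
    "frame_011": {"a", "b", "c", "z"},   # 3 in common -> best match
    "frame_012": {"m", "n"},             # 0 in common
}
print(select_stored_image(current, history))  # ('frame_011', 3)
```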
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Fukuyasu to incorporate the teachings of Kojo to include a surrounding situation collation unit that collates an obstacle behind the host vehicle obtained by the rear observation unit with the stationary object stored by the storage unit when the host vehicle moves backward along a route where the host vehicle has moved forward; and a reverse playback backward movement assistance control unit that assists backward travel by reverse playback of a driving operation stored in the storage unit when the host vehicle moves forward, when the host vehicle moves backward along a route where the host vehicle has moved forward, wherein the reverse playback backward movement assistance control unit determines whether the reverse playback backward movement assistance is performed according to a result of the collation obtained by the surrounding situation collation unit. Doing so would optimize vehicle operation.

Claim 1 is rejected on a similar rationale.

Regarding Claim 3, Fukuyasu in view of Kojo teaches the driving assistance device according to claim 2. Fukuyasu also teaches wherein the front observation unit and the rear observation unit are configured using a sensor having the same characteristics (Para [0052]), the obstacle determination unit determines the same obstacle detected by both the front observation unit and the rear observation unit as the stationary object (Para [0082]: “These captured images or detection results (starting vehicle surrounding information) are compared with parking vehicle surrounding information at the same position (S87). For example, when comparing captured images, the similarity between both images is checked, and if there is an object that has not been captured at the time of parking, it is determined whether or not the object is an obstacle. It is also checked whether or not the object photographed during parking has moved.”), the surrounding situation collation unit determines, as a moving object, an object that is not present in the stationary object stored in the storage unit and is present in the current obstacle behind the host vehicle obtained from the rear observation unit in collation between the stationary object stored in the storage unit and the current obstacle behind the host vehicle obtained from the rear observation unit, and when the moving object is included at the time of collation, the reverse playback backward movement assistance control unit does not perform the reverse playback backward movement assistance (Para [0084]: “On the other hand, when there is no obstacle on the escape route (S88: No) and when the vehicle 101 is not stopped (S100: No), the switch operation information and the sensor information are acquired, and whether or not the automatic start control is canceled Check out. If there is an operation for stopping the automatic start control (S89: Yes), a command is sent to the brake device 200 to operate the brake to stop the vehicle 101, and a chime sound indicating the end of the automatic start control is sent from the speaker 15. Is output, and a voice message such as “Automatic start control is stopped” is transmitted (S97).”).

Regarding Claim 4, Fukuyasu in view of Kojo teaches the driving assistance device according to claim 3. Fukuyasu also teaches wherein the surrounding situation collation unit determines that an object that is present in the stationary object stored in the storage unit and is not present in the current obstacle behind the host vehicle obtained from the rear observation unit is highly likely to be a failure of the sensor in the collation between the stationary object stored in the storage unit and the current obstacle behind the host vehicle obtained from the rear observation unit, and when it is determined that there is a high possibility of a failure in the sensor at the time of collation, the reverse playback backward movement assistance control unit does not perform the reverse playback backward movement assistance and alerts the driver (Para [0080-0087]).

Regarding Claim 6, Fukuyasu in view of Kojo teaches the driving assistance device according to claim 3. Fukuyasu also teaches further comprising: a passing space determination unit that detects a passing space on a rear side of the host vehicle where the host vehicle can pass as a result of collation by the surrounding situation collation unit; and a rear side movement assistance control unit that moves to the passing space on the rear side of the host vehicle detected by the passing space determination unit to assist the passing (Para [0078]).

Regarding Claim 7, Fukuyasu in view of Kojo teaches the driving assistance device according to claim 2. Fukuyasu also teaches wherein the front observation unit includes a sensor capable of determining a type of an object based on an image, the obstacle determination unit determines the stationary object from a detection result of the rear observation unit based on type information of the object determined by the front observation unit, the surrounding situation collation unit determines, as a moving object, an object that is not present in the stationary object stored in the storage unit and is present in the current obstacle behind the host vehicle obtained from the rear observation unit in collation between the stationary object stored in the storage unit and the current obstacle behind the host vehicle obtained from the rear observation unit, and when the moving object is included at the time of collation, the reverse playback backward movement assistance control unit does not perform the reverse playback backward movement assistance (Para [0082-0084]).

Regarding Claim 8, Fukuyasu in view of Kojo teaches the driving assistance device according to claim 7. Fukuyasu also teaches wherein the surrounding situation collation unit determines that an object that is present in the stationary object stored in the storage unit and is not present in the current obstacle behind the host vehicle obtained from the rear observation unit is highly likely to be a failure of the sensor in the collation between the stationary object stored in the storage unit and the current obstacle behind the host vehicle obtained from the rear observation unit, and when it is determined that there is a high possibility of the failure in the sensor at the time of the collation, the reverse playback backward movement assistance control unit does not perform the reverse playback backward movement assistance and alerts the driver (Para [0082]).

Regarding Claim 9, Fukuyasu in view of Kojo teaches the driving assistance device according to claim 7. Fukuyasu also teaches wherein the obstacle determination unit detects left and right stopped vehicles or a construction area, the storage unit stores the left and right stopped vehicles or the construction area detected by the obstacle determination unit, and the driving assistance device further includes a passing space determination unit that detects a passing space on a rear side of the host vehicle where the host vehicle can pass as a result of collation by the surrounding situation collation unit, and a rear side movement assistance control unit that moves to the passing space on the rear side of the host vehicle detected by the passing space determination unit to assist the passing (Para [0083]).

Regarding Claim 14, Fukuyasu in view of Kojo teaches the driving assistance device according to claim 6. Fukuyasu also teaches wherein the passing space determination unit determines, as the passing space, a space in which an object having a predetermined depth or more behind the host vehicle is present among objects present in the stationary object stored in the storage unit and not present in the current obstacle behind the host vehicle obtained from the rear observation unit in the collation between the stationary object stored in the storage unit and the current obstacle behind the host vehicle obtained from the rear observation unit (Para [0082-0083]).

Allowable Subject Matter

Claims 5, 10 and 11 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Kim (US10370031B2) discloses an assistant system and assistant method for backward driving of a vehicle, wherein the assistant system includes an input unit configured to receive a backward driving assistant command, a position tracking unit configured to track real-time positions of the vehicle, an obstacle sensing unit configured to generate an obstacle sensing signal, a steering device manipulation unit configured to adjust a steering angle of a steering device of the vehicle, a primary backward path generation unit configured to generate a primary backward path, a secondary backward path generation unit configured to generate a secondary backward path for the vehicle to drive backward, and a control unit configured to match the real-time positions of the vehicle to the secondary backward path and, at the time of a backward driving, to control the steering device manipulation unit.

Gunzel et al. (US11891064B2) discloses operating a backup assistance system for a vehicle, wherein the backup assistance system enables travel in reverse along a previously travelled trajectory, wherein a maximum distance for the reverse travel along the previously travelled trajectory is determined and output using a control unit on the basis of an expected error for the reverse travel along the previously travelled trajectory, in which the expected error exceeds a predefined error threshold value.

Suzuki et al. (US9830826B) teaches a driving assistance apparatus that includes an arrangement memory that stores a past captured image around a target parking region, a captured image acquisition section that acquires a present captured image around the target parking region at parking or departing with respect to the target parking region, and an obstacle specification section that specifies a non-stationary obstacle around the target parking region based on a difference between the past captured image stored in the arrangement memory and the present captured image acquired by the captured image acquisition section. A report section performs a report indicating presence of a non-stationary object when closely approaching the non-stationary object.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ABDHESH K JHA, whose telephone number is (571) 272-6218. The examiner can normally be reached M-F, 0800-1700.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, James J Lee, can be reached at 571-270-5965. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ABDHESH K JHA/
Primary Examiner, Art Unit 3668
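The collation behavior recited in claims 3 and 4 (an object detected now but absent from storage is treated as a moving object and assistance is withheld; an object in storage but not detected now suggests a sensor fault and triggers an alert) reduces to two set differences. A minimal sketch with hypothetical names, idealizing object identity as hashable IDs where a real system would match detections spatially:

```python
# Minimal sketch of the surrounding-situation collation recited in
# claims 3 and 4. All names are illustrative, not from the claims.

def collate(stored_stationary, current_rear):
    moving = current_rear - stored_stationary   # seen now, not stored
    missing = stored_stationary - current_rear  # stored, not seen now
    if moving:
        return "suspend assistance (moving object present)"
    if missing:
        return "suspend assistance and alert driver (possible sensor failure)"
    return "perform reverse-playback assistance"

print(collate({"wall", "pole"}, {"wall", "pole", "pedestrian"}))
print(collate({"wall", "pole"}, {"wall"}))
print(collate({"wall", "pole"}, {"wall", "pole"}))
```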

Prosecution Timeline

Sep 16, 2024: Application Filed
Jan 23, 2026: Non-Final Rejection under §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602959: VEHICLE STORAGE MANAGEMENT SYSTEM, STORAGE MEDIUM, AND STORAGE MANAGEMENT METHOD (granted Apr 14, 2026; 2y 5m to grant)
Patent 12592100: VEHICLE-BASED DATA OPTIMIZATION (granted Mar 31, 2026; 2y 5m to grant)
Patent 12572156: SYSTEMS AND METHODS FOR LANDING SITE SELECTION AND FLIGHT PATH PLANNING FOR AN AIRCRAFT USING SOARING WEATHER (granted Mar 10, 2026; 2y 5m to grant)
Patent 12573250: Used car AI performance inspection system based on acoustic data analysis, and processing method therefor (granted Mar 10, 2026; 2y 5m to grant)
Patent 12555419: METHOD FOR REAL-TIME ECU CRASH REPORTING AND RECOVERY (granted Feb 17, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 80%
With Interview: 99% (+18.3%)
Median Time to Grant: 2y 5m
PTA Risk: Low
Based on 408 resolved cases by this examiner. Grant probability derived from career allow rate.
