Prosecution Insights
Last updated: April 19, 2026
Application No. 18/874,148

DRIVING ASSISTANCE DEVICE, DRIVING ASSISTANCE METHOD, AND RECORDING MEDIUM

Non-Final OA (§102, §103)
Filed: Dec 12, 2024
Examiner: POINT, RUFUS C
Art Unit: 2689
Tech Center: 2600 — Communications
Assignee: NEC Corporation
OA Round: 1 (Non-Final)
Grant Probability: 74% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 0m
Grant Probability With Interview: 92%

Examiner Intelligence

Career Allow Rate: 74% (522 granted / 707 resolved; +11.8% vs TC avg — above average)
Interview Lift: +18.7% for resolved cases with interview (strong)
Avg Prosecution: 3y 0m (typical timeline)
Career History: 735 total applications across all art units; 28 currently pending
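The headline figures above can be reproduced from the reported raw counts. A back-of-the-envelope sketch — assuming the interview lift is additive in percentage points, which the dashboard does not state explicitly:

```python
# Career allow rate from the reported raw counts.
granted = 522
resolved = 707
allow_rate = granted / resolved  # ~0.738, displayed as 74%

# Interview lift, reported as +18.7 percentage points.
interview_lift = 0.187
with_interview = allow_rate + interview_lift  # ~0.925

print(f"{allow_rate:.1%} base, {with_interview:.1%} with interview")
# -> 73.8% base, 92.5% with interview
```

The additive result (~92.5%) is consistent with the displayed "92% With Interview" after rounding, but the exact derivation used by the dashboard is an assumption here.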

Statute-Specific Performance

§101: 3.9% (-36.1% vs TC avg)
§102: 19.7% (-20.3% vs TC avg)
§103: 62.7% (+22.7% vs TC avg)
§112: 9.1% (-30.9% vs TC avg)
Tech Center averages are estimates. Based on career data from 707 resolved cases.
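A quick consistency check on the table above: subtracting each reported delta from the examiner's rate recovers the implied Tech Center average, and every statute here implies the same 40.0% baseline. The figures are taken from the table; the reconstruction itself is illustrative arithmetic only:

```python
# (examiner rate %, delta vs TC average %) per statute, from the table above.
stats = {
    "101": (3.9, -36.1),
    "102": (19.7, -20.3),
    "103": (62.7, +22.7),
    "112": (9.1, -30.9),
}

# Implied TC average = examiner rate minus reported delta.
tc_avg = {s: round(rate - delta, 1) for s, (rate, delta) in stats.items()}
print(tc_avg)  # every statute implies a 40.0% TC average
```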

Office Action

Grounds: §102, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claim(s) 1-6 and 8-11 are rejected under 35 U.S.C. 102(a)(1) and (a)(2) as being anticipated by Hieida (US 20220169245 A1).

Claim 1. Hieida teaches a driving assistance device (Figs. 1 and 2) comprising: at least one memory configured to store instructions ([0083] The storage unit 111 stores various types of programs [0114] storage unit 205/206); and at least one processor ([0113] information processing system 200) configured to execute the instructions to: acquire a travel information about a vehicle ([0088] The vehicle state detection unit 143...
target include a speed, acceleration, a steering angle, presence or absence of and contents of abnormality, a state of a driving operation [0070] [0116] the data acquisition unit 102 includes various types of sensors each for detecting information associated with the outside of the own vehicle...image input unit 201); detect a dangerous driving of the vehicle based on sidewalk information that pertains to a sidewalk and that is included in the travel information ([0086] the exterior information detection unit 141 performs a detection process for detecting an object around the own vehicle, a recognition process for recognizing the object, a tracking process for tracking the object, and a detection process for detecting a distance to the object...a human, an obstacle... [0103] The emergency avoidance unit 171 performs a detection process for detecting an emergency such as a collision, a contact, entrance into a dangerous zone, abnormality of the driver, and abnormality of the vehicle on the basis of detection results obtained by the exterior information detection unit 141 [0111] identification of a region of a pedestrian in an image of the front of the own vehicle captured by an in-vehicle camera...detailed information indicating a region in ground contact with the pedestrian (e.g., sidewalk or driveway), and also a region with which the pedestrian is likely to come into ground contact. [0135]-[0137] the measuring unit 208 measures a steering angle and a vehicle speed of the own vehicle. Thereafter, the danger level determination unit 209 predicts a future reaching range of the own vehicle…a vehicle speed, searches for an intersection of the predicted future reaching range and the estimated moving range 601 (described above) of the pedestrian A, and determines that there is a danger of a collision between the pedestrian A and the own vehicle in a case where an intersection has been detected. (e.g.
information from the detected pedestrian on the sidewalk, emergency avoidance and vehicle measurements are all gathered to detect a dangerous situation)); and present the dangerous driving ([0077] The output control unit 105 controls output of various types of information to the person on board of the own vehicle or to the outside of the vehicle. For example, the output control unit 105 generates an output signal containing at least either visual information (e.g., image data) or auditory information (e.g., audio data)... containing a warning sound, a warning message, or the like for a danger such as a collision, a contact, and entrance into a dangerous zone...).

Claim 2. Hieida teaches the driving assistance device according to claim 1, wherein the at least one processor is further configured to execute the instructions to: acquire position information of the vehicle ([0071] the data acquisition unit 102... detecting a current position of the own vehicle. [0089] self-position estimation unit 132), and detect the dangerous driving based on the position information and coordinate information of the sidewalk ([0120][0123]... a position of each of the objects is represented as position information in an x-y coordinate system of a world coordinate system... information associated with the moving track is represented as position information associated with each of the objects for each predetermined interval (time interval or distance interval)... the object moving range estimation unit 207).

Claim 3. Hieida teaches the driving assistance device according to claim 1, wherein the at least one processor is further configured to execute the instructions to: detect the dangerous driving based on position information of a pedestrian ([0120][0123][0125] extracted by the tracking unit 203... a position of each of the objects is represented as position information in an x-y coordinate system of a world coordinate system... the object moving range estimation unit 207...
danger level determination unit 209 determines a danger level of a collision with the own vehicle for each of the objects on the basis of a comparison result between the moving range of each of the objects estimated by the object moving range estimation unit 207 and the vehicle control information associated with the own vehicle such as a steering angle and a vehicle speed.).

Claim 4. Hieida teaches the driving assistance device according to claim 3, wherein the position information of the pedestrian is acquired by a mobile terminal ([0115] the constituent elements of the information processing system 200 may also be implemented using an information terminal carried into the vehicle interior by the person on board, such as a smartphone and a tablet...).

Claim 5. Hieida teaches the driving assistance device according to claim 2, wherein the at least one processor is further configured to execute the instructions to: detect the dangerous driving based on whether a roadway and the sidewalk are distinguished from each other ([0181][0184] it is predicated that the pedestrian A will transit on the ground contact surface in an order of the driveway, the driveway, the driveway, the driveway, the sidewalk, the sidewalk, and others in the future.).

Claim 6. Hieida teaches the driving assistance device according to claim 3, wherein the at least one processor is further configured to execute the instructions to: detect the dangerous driving in a case where a distance between the vehicle and the pedestrian exceeds a threshold and the vehicle suddenly brakes ([0261] The emergency avoidance unit 171 plans an action of the own vehicle for avoiding a collision with the object determined to possibly collide with the own vehicle, and supplies data indicating the planned action of the own vehicle to the acceleration/deceleration control unit 172 and the direction control unit 173 to achieve the damage reduction brake function.
[0300] Then, the danger level determination unit 2513 compares a length of the time required for the own vehicle to reach the intersection with a predetermined threshold (step S3519). [0301] In a case where the time required for the own vehicle to reach the intersection is equal to or shorter than the threshold (Yes in step S3519), the danger level determination unit 2513 determines that there is a danger of a collision between the own vehicle and the object. (e.g. exceeding a short threshold)).

Claim 8. Hieida teaches the driving assistance device according to claim 1, wherein the at least one processor is further configured to execute the instructions to: warn a driver of the vehicle of the dangerous driving ([0113] Drive assistance such as warning to the driver, [0126] the vehicle control system 100 may also be configured such that the output control unit 105 gives warning, such as output of audio data containing a warning sound, a warning message, or the like from the output unit 106, as well as the damage reduction brake function.).

Claim 9. Hieida teaches the driving assistance device according to claim 8, wherein the at least one processor is further configured to execute the instructions to: warn the driver near an occurrence position stored in a storage device ([0121] The contact region time-series information storage unit 206 stores, for each of the objects, time-series information associated with the contact region of each of the objects and determined by the contact region determination unit 204. [0125] danger level determination unit 209 determines a danger level of a collision with the own vehicle for each of the objects on the basis of a comparison result between the moving range of each of the objects estimated by the object moving range estimation unit 207 and the vehicle control information associated with the own vehicle such as a steering angle and a vehicle speed.).

Claim 10.
Hieida teaches a driving assistance method comprising: acquiring travel information of a vehicle ([0088] The vehicle state detection unit 143... target include a speed, acceleration, a steering angle, presence or absence of and contents of abnormality, a state of a driving operation [0070] [0116] the data acquisition unit 102 includes various types of sensors each for detecting information associated with the outside of the own vehicle...image input unit 201); detecting dangerous driving of the vehicle based on sidewalk information that pertains to a sidewalk included in the travel information ([0086] the exterior information detection unit 141 performs a detection process for detecting an object around the own vehicle, a recognition process for recognizing the object, a tracking process for tracking the object, and a detection process for detecting a distance to the object...a human, an obstacle... [0103] The emergency avoidance unit 171 performs a detection process for detecting an emergency such as a collision, a contact, entrance into a dangerous zone, abnormality of the driver, and abnormality of the vehicle on the basis of detection results obtained by the exterior information detection unit 141 [0111] identification of a region of a pedestrian in an image of the front of the own vehicle captured by an in-vehicle camera, and further allows acquisition of detailed information indicating a region in ground contact with the pedestrian (e.g., sidewalk or driveway), and also a region with which the pedestrian is likely to come into ground contact. [0135]-[0137] the measuring unit 208 measures a steering angle and a vehicle speed of the own vehicle. 
Thereafter, the danger level determination unit 209 predicts a future reaching range of the own vehicle…a vehicle speed, searches for an intersection of the predicted future reaching range and the estimated moving range 601 (described above) of the pedestrian A, and determines that there is a danger of a collision between the pedestrian A and the own vehicle in a case where an intersection has been detected. (e.g. information from the emergency avoidance unit, detected pedestrian and vehicle measurements are gathered to detect a dangerous situation)); and presenting the dangerous driving in a case where the dangerous driving has been detected ([0077] The output control unit 105 controls output of various types of information to the person on board of the own vehicle or to the outside of the vehicle. For example, the output control unit 105 generates an output signal containing at least either visual information (e.g., image data) or auditory information (e.g., audio data)... containing a warning sound, a warning message, or the like for a danger such as a collision, a contact, and entrance into a dangerous zone...).

Claim 11. Hieida teaches a non-transitory recording medium having stored therein a program causing a computer to execute: acquiring travel information of a vehicle from the vehicle ([0088] The vehicle state detection unit 143...
target include a speed, acceleration, a steering angle, presence or absence of and contents of abnormality, a state of a driving operation [0070] [0116] the data acquisition unit 102 includes various types of sensors each for detecting information associated with the outside of the own vehicle...image input unit 201); detecting dangerous driving of the vehicle based on sidewalk information that pertains to a sidewalk included in the travel information ([0086] the exterior information detection unit 141 performs a detection process for detecting an object around the own vehicle, a recognition process for recognizing the object, a tracking process for tracking the object, and a detection process for detecting a distance to the object...a human, an obstacle... [0103] The emergency avoidance unit 171 performs a detection process for detecting an emergency such as a collision, a contact, entrance into a dangerous zone, abnormality of the driver, and abnormality of the vehicle on the basis of detection results obtained by the exterior information detection unit 141 [0111] identification of a region of a pedestrian in an image of the front of the own vehicle captured by an in-vehicle camera, and further allows acquisition of detailed information indicating a region in ground contact with the pedestrian (e.g., sidewalk or driveway), and also a region with which the pedestrian is likely to come into ground contact. [0135]-[0137] the measuring unit 208 measures a steering angle and a vehicle speed of the own vehicle. Thereafter, the danger level determination unit 209 predicts a future reaching range of the own vehicle…a vehicle speed, searches for an intersection of the predicted future reaching range and the estimated moving range 601 (described above) of the pedestrian A, and determines that there is a danger of a collision between the pedestrian A and the own vehicle in a case where an intersection has been detected. (e.g. 
information from the emergency avoidance unit, detected pedestrian and vehicle measurements are gathered to detect a dangerous situation)); and presenting the dangerous driving to a predetermined output destination in a case where the dangerous driving has been detected ([0077] The output control unit 105 controls output of various types of information to the person on board of the own vehicle or to the outside of the vehicle. For example, the output control unit 105 generates an output signal containing at least either visual information (e.g., image data) or auditory information (e.g., audio data)... containing a warning sound, a warning message, or the like for a danger such as a collision, a contact, and entrance into a dangerous zone...).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows: 1.
Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3. Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 7 is rejected under 35 U.S.C. 103 as being unpatentable over Hieida in view of Sisbot (US 20160063761 A1).

Claim 7. Hieida teaches the driving assistance device according to claim 1, and further discloses the use of a camera to capture images over time, but does not specifically disclose wherein the at least one processor is further configured to execute the instructions to: capture a traveling video of the vehicle, and compare the traveling video with the sidewalk information and detect the dangerous driving. However, Sisbot teaches wherein the at least one processor is further configured to execute the instructions to: capture a traveling video of the vehicle, and compare the traveling video with the sidewalk information and detect the dangerous driving ([0060] In another example, the detection module 222 receives images or video from the camera 233 and identifies the location of objects, such as pedestrians or stationary objects including buildings, lane markers, obstacles, etc. [0071] the danger assessment module 226 determines a vehicle path for the first client device 103 based on the object data 297 and compares the vehicle path to an object path to determine whether there is a likelihood of collision between the first client device 103 and the object.).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to use a traveling video of the vehicle, and compare the traveling video, as taught by Sisbot, within the system of Hieida for the purpose of enhancing the system to obtain a stream of images within a period of time in order to fine-tune real-time detection of a pedestrian as the vehicle travels.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to RUFUS C POINT whose telephone number is (571) 270-7510. The examiner can normally be reached 9am-5pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Davetta Goins, can be reached at 571-272-2957. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/RUFUS C POINT/
Primary Examiner, Art Unit 2689

Prosecution Timeline

Dec 12, 2024
Application Filed
Feb 06, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602957
SERVER, SYSTEM, AND METHOD FOR PROVIDING ENERGY EFFICIENCY MANAGEMENT SERVICE
2y 5m to grant Granted Apr 14, 2026
Patent 12589765
SYSTEMS AND METHODS FOR ENHANCING OPERATOR VIGILANCE
2y 5m to grant Granted Mar 31, 2026
Patent 12592107
SPOKEN NOTIFICATIONS FOR ACOUSTIC VEHICLE ALERTING SYSTEMS
2y 5m to grant Granted Mar 31, 2026
Patent 12573297
METHOD FOR PROVIDING BUS INFORMATION IN REAL TIME, AND SYSTEM AND APPLICATION IMPLEMENTING THE METHOD
2y 5m to grant Granted Mar 10, 2026
Patent 12566936
GENERATING A MEDIA-BASED UNIQUE OBJECT IDENTIFIER
2y 5m to grant Granted Mar 03, 2026
Study what changed in these cases to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 74%
With Interview: 92% (+18.7%)
Median Time to Grant: 3y 0m
PTA Risk: Low
Based on 707 resolved cases by this examiner. Grant probability derived from career allow rate.
