Prosecution Insights
Last updated: April 19, 2026
Application No. 18/906,282

AUTONOMOUS DRIVING ASSISTANCE DEVICE AND AUTONOMOUS DRIVING ASSISTANCE SYSTEM

Non-Final OA: §101, §103
Filed: Oct 04, 2024
Examiner: PINKERTON, ROBERT LOUIS
Art Unit: 3665
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Mitsubishi Electric Corporation
OA Round: 1 (Non-Final)
Grant Probability: 86% (Favorable)
OA Rounds: 1-2
To Grant: 2y 8m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 86% (above average; 60 granted / 70 resolved; +33.7% vs TC avg)
Interview Lift: +22.1% for resolved cases with interview vs. without (a strong +22% lift)
Typical Timeline: 2y 8m avg prosecution; 6 applications currently pending
Career History: 76 total applications across all art units

Statute-Specific Performance

§101: 16.5% (-23.5% vs TC avg)
§103: 52.4% (+12.4% vs TC avg)
§102: 22.7% (-17.3% vs TC avg)
§112: 7.8% (-32.2% vs TC avg)
Black line = Tech Center average estimate. Based on career data from 70 resolved cases.
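The headline figures above are mutually consistent; a minimal sketch reproducing them from the raw counts (the Tech Center average of 52.0% is an inferred assumption derived from the displayed +33.7% delta, not a figure the dashboard states):

```python
# Sketch: reproduce the dashboard's career allow rate and TC-average delta
# from the raw counts shown above. tc_avg_estimate is an assumption inferred
# from the displayed +33.7% delta, not a reported figure.
granted, resolved = 60, 70
allow_rate = granted / resolved               # 0.857..., displayed as 86%
tc_avg_estimate = 0.52                        # assumption: 85.7% - 33.7%
delta_vs_tc = allow_rate - tc_avg_estimate

print(f"Career allow rate: {allow_rate:.1%}")   # Career allow rate: 85.7%
print(f"vs TC avg: {delta_vs_tc:+.1%}")         # vs TC avg: +33.7%
```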

Office Action

Rejections: §101, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Examiner acknowledges Applicant's claim for priority to Japanese Patent Application No. JP2023-193600 under 35 U.S.C. 119 and receipt of the priority document filed on 11/14/2023.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 10/04/2024 has been received, considered, and is in compliance with the provisions of 37 CFR 1.97. Accordingly, the IDS has been considered by the Examiner.

Specification

The following guidelines illustrate the preferred layout and content for the specification of a utility application. These guidelines are suggested for the applicant's use.

Content of Specification:
(a) TITLE OF THE INVENTION: See 37 CFR 1.72(a) and MPEP § 606. The title of the invention should be placed at the top of the first page of the specification unless the title is provided in an application data sheet. The title of the invention should be brief but technically accurate and descriptive, preferably from two to seven words. It may not contain more than 500 characters.
(b) CROSS-REFERENCES TO RELATED APPLICATIONS: See 37 CFR 1.78 and MPEP § 211 et seq.
(c) STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT: See MPEP § 310.
(d) THE NAMES OF THE PARTIES TO A JOINT RESEARCH AGREEMENT: See 37 CFR 1.71(g).

The disclosure is objected to because of the following informalities: the specification is missing (b) CROSS-REFERENCES TO RELATED APPLICATIONS; see 37 CFR 1.78 and MPEP § 211 et seq. The lengthy specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant's cooperation is requested in correcting any errors of which applicant may become aware in the specification. Appropriate correction is required.
Claim Objections

Claims 1-8 and 17-20 are objected to because of the following informalities: the word "circuitry" is misspelled as "circuity". See the definition from Merriam-Webster provided immediately below:

[Image: Merriam-Webster definition of "circuitry" (media_image1.png, greyscale)]

For examination purposes the limitation has been considered as reciting "circuitry", and amending the claims to recite "circuitry" would appear to overcome this objection. Appropriate correction is required.

Claim Rejections – 35 U.S.C. § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.

On January 7, 2019, the USPTO released new examination guidelines for determining whether a claim is directed to non-statutory subject matter. According to the guidelines, a claim is directed to non-statutory subject matter if (a) it does not fall within one of the four statutory categories of invention, or (b) it meets a three-prong test for determining that: (1) the claim recites a judicial exception, e.g. an abstract idea, (2) without integration into a practical application, and (3) does not recite additional elements that provide significantly more than the recited judicial exception.

Claim 1 is directed toward a system and computer. Therefore, claim 1 falls within one of the four statutory categories of invention. However, the claim clearly does not meet the three-prong test for patentability.
With regard to the first prong, whether the claim recites a judicial exception, the guidelines provide three groupings of subject matter that are considered abstract ideas:

(a) Mathematical concepts - mathematical relationships, mathematical formulas or equations, mathematical calculations;
(b) Certain methods of organizing human activity - fundamental economic principles or practices (including hedging, insurance, mitigating risk); commercial or legal interactions (including agreements in the form of contracts; legal obligations; advertising, marketing or sales activities or behaviors; business relations); managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions); and
(c) Mental processes - concepts performed in the human mind (including an observation, evaluation, judgment, opinion).

Applicant's claim 1 is directed toward the abstract idea of receiving, determining and/or obtaining sensor data, comparing and/or matching the sensor data, and outputting, via an output circuity, a result based on the comparing and/or matching of the sensor data, which comprises mathematical concepts applied to data and deriving a result based on the application. Thus, the claim is directed towards an abstract idea.
With regard to the second prong, whether the abstract idea is integrated into a practical application, the guidelines provide the following exemplary considerations that are indicative that an additional element (or combination of elements) may have integrated the judicial exception into a practical application: an additional element reflects an improvement in the functioning of a computer, or an improvement to other technology or technical field; an additional element applies or uses a judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition; an additional element implements a judicial exception with, or uses a judicial exception in conjunction with, a particular machine or manufacture that is integral to the claim; an additional element effects a transformation or reduction of a particular article to a different state or thing; and an additional element applies or uses the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception.

Applicant's claims do not comprise any of the above additional elements that, individually or in combination, have integrated the judicial exception into a practical application. There is no improvement in the functioning of a computer. Nor are the limitations implemented in a particular machine or manufacture. There is no transformation or reduction of a particular article to a different state or thing. There are no additional elements that apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment.

Claim 1 recites one additional element: a computer comprising at least one processor.
The computer is recited at a high level of generality, i.e., as a generic processor performing a generic computer function of processing data. This generic processor limitation is no more than mere instructions to apply the exception using a generic computer component. Accordingly, this additional element does not integrate the abstract idea into a practical application because it does not impose any meaningful limits on practicing the abstract idea. Notably, there is no actual use or presentation of the motion plans, such as controlling the vehicle.

While the guidelines further state that the exemplary considerations are not an exhaustive list and that there may be other examples of integrating the exception into a practical application, the guidelines also list examples in which a judicial exception has not been integrated into a practical application: an additional element merely recites the words "apply it" (or an equivalent) with the judicial exception, or merely includes instructions to implement an abstract idea on a computer, or merely uses a computer as a tool to perform an abstract idea; an additional element adds insignificant extra-solution activity to the judicial exception; and an additional element does no more than generally link the use of a judicial exception to a particular technological environment or field of use.

Since the abstract idea in Applicant's claim 1 is implemented on a computer and there are no further limitations or structural elements that go beyond the computer, the abstract idea of receiving, determining and/or obtaining sensor data, comparing and/or matching the sensor data, and outputting a result based on the comparing and/or matching of the sensor data is merely implemented on a computer. Thus, there is no integration of the abstract idea into a practical application.
With regard to the third prong, whether the claims recite additional elements that provide significantly more than the recited judicial exception, the guidelines specify that the pre-guideline procedure is still in effect. Specifically, examiners should continue to consider whether an additional element or combination of elements: adds a specific limitation or combination of limitations that are not well-understood, routine, conventional activity in the field, which is indicative that an inventive concept may be present; or simply appends well-understood, routine, conventional activities previously known to the industry, specified at a high level of generality, to the judicial exception, which is indicative that an inventive concept may not be present.

Applicant's claims do not recite additional elements that provide significantly more than the recited judicial exception. The use of one or more computers to implement mathematical operations is a well-understood, routine and conventional activity.

Thus, since claim 1 is (a) directed toward an abstract idea, (b) not integrated into a practical application, and (c) does not comprise significantly more than the recited abstract idea, it is directed toward non-statutory subject matter and rejected under 35 U.S.C. 101. Claims 2-20 are also rejected as depending on claim 1.

Claim Rejections - 35 U.S.C. § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains.
Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1 and 9 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. 20230382389 A1 to Mochizuki in view of U.S. 9631941 B2 to Sugimoto et al. (Sugimoto), in further view of U.S. 20230174097 A1 to Kakuta.

Regarding claim 1, Mochizuki discloses an autonomous driving assistance device for determining whether or not a specific area is usable, the autonomous driving assistance device comprising:

an object detector to detect an object from sensor data acquired by a sensor device which monitors the specific area (Mochizuki discloses an object detector operating on image data from a group of cameras 16 (see Fig. 1; [0029] (a group of cameras (a front camera 16A, a left front camera 16B, a left rear camera 16C, a right front camera 16D, a right rear camera 16E, and a rear camera 16F) that capture images of surroundings of the vehicle (collectively, "group of cameras 16") are connected to the vehicle control device 14…a group of radars 18 including a plurality of millimeter-wave radars and LIDAR is connected to the vehicle control device 14); [0030] (autonomous driving control device 20 determines the driving operations to reach the destination based on information necessary for autonomous driving from the vehicle control device 14…and instructs the vehicle control device 14 to perform the driving operations); [0031] (autonomous driving control device 20 is communicable with the driving assistance control device 12 via the wireless communication device 22A of the network 22)));

a position calculation circuity to calculate a position, in the real world, of the object detected by the object detector (Mochizuki discloses a position of an object(s) detected by the object detector ([0037] (locations of an object 24 for recognition based on images captured before and after a predefined time interval are compared, and a numerical value (score) based on an overlapping area of the object 24 for recognition is calculated…the score is an area); [0038] (the object 24 for recognition may be distinguished into any of the following objects 24 for recognition: [0039] (1) an object 24 for recognition that is stationary; [0040] (2) an object 24 for recognition moving at a relatively low speed; and [0041] (3) an object 24 for recognition moving at a relatively high speed); [0100] (autonomous driving is performed based on real-time information applied to this high-precision map…moving states of the same object 24 for recognition before and after the detection time interval indicated from the driving assistance control device 12 side are calculated); see Figs. 6A-B; [0101] (location information depending on the change in time is acquired for a parked vehicle 24A, a traveling vehicle 24B, and a person 24C crossing a road as objects 24 for recognition in the vicinity of the vehicle 10 during autonomous driving); [0102] (the detection time interval is 0.1 seconds…the area of overlap between the occupied areas of the parked vehicle 24A before and after the detection time interval is equal to the occupied area of the parked vehicle 24A)));

a map information acquisition circuity to acquire a position, in the real world, of the specific area (Mochizuki discloses score calculation unit 39 as acquisition circuitry to acquire a position within an area in the real world (see Figs. 4A-B; [0054] (the time interval acquisition unit 38 is connected to a score calculation unit 39, and transmits information about the acquired time interval to the score calculation unit 39); [0055] (the score calculation unit 39 is connected to the driving control unit 32, and acquires information about the real-time high-precision map at acquired predefined time intervals); [0056] (the score calculation unit 39 analyses this high-precision map and calculates, for each of the objects 24 for recognition, an area of overlap between the occupied areas of the same object 24 for recognition acquired before and after the predefined time interval); [0065] (the assistance control unit 52 transmits the acquired information to the user interface 40))); and

an output circuity to output a result, of the determination performed by the determination circuity, indicating whether or not the specific area is usable (Mochizuki discloses a user interface 40 including monitor 40B as an output device to output the result(s) of the determination(s) (see Fig. 2; [0066] (user interface 40 is equipped with…a monitor 40B as an output device); [0067] (real-time on-site images…are displayed on the monitor 40B, which are viewed by the operator OP); [0068] (based on the images displayed on the monitor 40B, the operator OP determines what kind of assistance is to be provided and inputs assistance instruction information through the input device 40A); [0091] (operator OP…visually manages the monitor 40B of the user interface 40 of the driving assistance control device 12, recognizing information about the recognized object 24 for recognition, the operator OP inputs avoidance information using the input device 40A to avoid a situation (event) that interferes with autonomous driving); [0097] (driving assistance control device 12 indicates to the vehicle 10 a time interval that depends on movement of an object 24 for recognition…which allows the vehicle to acquire the necessary and sufficient data for information…on the monitor 40B))).

However, Mochizuki does not appear to further expressly disclose: a determination circuity to determine, by using the position of the object calculated by the position calculation circuity and the position of the specific area acquired from the map information acquisition circuity, whether or not the specific area is usable.

Sugimoto, in the same field of endeavor, further discloses: a determination circuity to determine, by using the position of the object calculated by the position calculation circuity and the position of the specific area acquired from the map information acquisition circuity, whether or not the specific area is usable (Sugimoto discloses determination circuitry to calculate whether or not a specific area is usable based on whether it is occupied by an object such as another vehicle or pedestrian, etc., and the level of occupation of the area (¶ (44) (image analyzing portion 8 generates information related to parking availability at a parking area at an imaging location…an image of a parking area as viewed from outside the parking area is captured by the vehicle C.sub.A…in response to a command from the image execution instructing portion 4, and the communication portion 9 obtains the image of outside of the parking area captured by the vehicle C.sub.A…image analyzing portion 8 detects the number of vehicles lined up outside the parking area based on the image of outside the parking area, and determines how crowded the parking area is); ¶ (47) (one example…for determining whether there is a chance that a parking space will become available…involves the image analyzing portion 8 detecting the number of vehicles that are moving and the number of vehicles that are not moving, and determining that there is a chance that a parking space will become available if the number of vehicles that are moving is equal to or greater than the number of vehicles that are not moving); ¶ (48) (information related to parking availability in a parking area at the imaging location of the vehicle C.sub.A or the vehicle C.sub.B is generated based on the image information obtained by the image analyzing portion 8, and…communication portion 9…information related to parking availability in a parking area that changes dynamically is able to be provided in real time))).
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the information collection control device of Mochizuki to incorporate the information providing system of Sugimoto, including image analyzing portion 8 and communication portion 9 to generate and obtain image and environment information within and outside a parking area, in order to calculate whether or not a specific area is usable based on the occupation and density of the disclosed area, with predictable results and a reasonable expectation of success. One of ordinary skill in the art would have been motivated to combine Mochizuki and Sugimoto for the express benefit of including image analyzing and communication portions of the autonomous vehicle to determine if the specified area is occupied or available, as explained in Sugimoto ¶ (44) and (47)-(48).

However, Sugimoto does not appear to further expressly disclose: an output circuity to output a result, of the determination performed by the determination circuity, indicating whether or not the specific area is usable.
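The availability heuristic cited from Sugimoto ¶ (47) is simple enough to state directly; a minimal sketch, where the function name and inputs are illustrative assumptions, not Sugimoto's actual implementation:

```python
# Sketch of the heuristic cited from Sugimoto para. (47): there is a chance a
# parking space will become available when the count of moving vehicles is at
# least the count of vehicles that are not moving. Names are illustrative.
def space_may_become_available(moving: int, not_moving: int) -> bool:
    """True if there is a chance a parking space will become available."""
    return moving >= not_moving

print(space_may_become_available(moving=5, not_moving=3))   # True
print(space_may_become_available(moving=2, not_moving=6))   # False
```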
Kakuta, in the same field of endeavor, further discloses: an output circuity to output a result, of the determination performed by the determination circuity, indicating whether or not the specific area is usable (Kakuta discloses roadside sensor device 100 including a position output unit 140 to output position information and detection information of a route(s) and obstacle(s) along the route(s), and an object vehicle surrounding obstacle output unit 324 to identify other vehicles ([0033] (roadside sensor device 100 includes a…roadside sensor device mounted position output unit 140 which has position information where the roadside sensor device 100 is mounted and which outputs the position information, a roadside sensor fusion unit 150 for calculating a detection result for the obstacle on the road, and a roadside information transmission unit 160 which outputs the detection result for the obstacle on the road to the obstacle information processing device 300); [0043] (route information output unit 230…outputs a road radius of a road frontward in the advancing direction on a route to the destination of the vehicle, to the vehicle information transmission unit 250); [0065] (the object vehicle surrounding obstacle output unit 324 sets…the vehicle of each identification number assigned by the identification number assignment unit 322, and determines a range in which presence of an obstacle is to be sent to the object vehicle, on the basis of information of the road radius frontward in the advancing direction of the vehicle outputted from the object vehicle…the object vehicle surrounding obstacle output unit 324 outputs information of obstacles in the range on the basis of the obstacle detection result in the absolute coordinate system from the vehicle surrounding obstacle information reception unit 323); [0082] (the object vehicle surrounding obstacle output unit 324 outputs an obstacle present in the first or second obstacle output range))).
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the system of the combination of Mochizuki and Sugimoto to incorporate the autonomous driving assistance system of Kakuta, including a roadside sensor device 100 equipped with a position output unit 140 to output position information and detection information of route(s) and obstacle(s) along the routes, as well as any objects and vehicles surrounding them, with predictable results and a reasonable expectation of success. One of ordinary skill in the art would have been motivated to combine Mochizuki, Sugimoto and Kakuta for the express benefit of including a roadside sensor device and position output unit to detect and analyze any objects immediately surrounding the vehicle along its path, as explained in Kakuta [0033], [0043], [0065], and [0082].

Regarding claim 9, Mochizuki discloses an autonomous driving assistance system comprising: the autonomous driving assistance device according to claim 1 (see claim 1, e.g. Mochizuki); and the sensor device which monitors the specific area (see claim 1, e.g. Mochizuki).

Claims 2-4 and 10-12 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. 20230382389 A1 to Mochizuki in view of U.S. 9631941 B2 to Sugimoto, in further view of U.S. 20230174097 A1 to Kakuta as applied to the claims above, in further view of U.S. 20230256998 A1 to Yoshinaga.

Regarding claim 2, the combination of Mochizuki, Sugimoto and Kakuta discloses the autonomous driving assistance device according to claim 1, as set forth in the rejection of the corresponding parts of claim 1 above, incorporated herein by reference.
However, the combination of Mochizuki, Sugimoto and Kakuta does not appear to further expressly disclose: an occlusion region acquisition circuity to acquire, by using the position of the object calculated by the position calculation circuity, a region that is occluded by the object.

Yoshinaga, in the same field of endeavor, further discloses: an occlusion region acquisition circuity to acquire, by using the position of the object calculated by the position calculation circuity, a region that is occluded by the object (Yoshinaga discloses an occlusion region along the path where the obstacle is occluding the vehicle path ([0081] (surroundings monitoring unit 134 determines the presence or absence of an obstacle occluding the travel path based on camera images or three-dimensional point cloud data…the obstacle may be a stationary object or moving object moving at or below a predefined speed…this allows a preceding vehicle that is temporarily stopped or moving slowly due to traffic congestion or the like to be determined to be an obstacle…surroundings monitoring unit 134 notifies the transceiver unit 136 of a result of determination as to whether the travel path is occluded, and forwards the acquired sensing data to the transceiver unit 136 and the driving control unit 138); [0089] (transceiver unit 136 determines whether the result of determination at step S112 by the surroundings monitoring unit 134 indicates that the travel path is occluded); [0115] (if…the surroundings monitoring unit 134 determines that the travel path is occluded…assistance method determination unit 114A searches the location information DB 122A for point information with a "location" whose distance from the location of the AD vehicle 30 indicated by the location information included in the vehicle information is within a predefined distance))).
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the system of the combination of Mochizuki, Sugimoto and Kakuta to incorporate the autonomous driving method of Yoshinaga, including a surroundings monitoring unit 134 to determine, calculate and transmit to the vehicle control unit information of an occlusion area, i.e. an area along a vehicle path where an obstacle or object has occluded the vehicle path, with predictable results and a reasonable expectation of success. One of ordinary skill in the art would have been motivated to combine Mochizuki, Sugimoto, Kakuta and Yoshinaga for the express benefit of a monitoring unit to monitor the conditions, obstacles and occlusion regions that may otherwise prevent the vehicle from navigating to its destination, as explained in Yoshinaga [0081], [0089] and [0115].

Regarding claim 3, Mochizuki further discloses the autonomous driving assistance device according to claim 2, further comprising a degree-of-priority calculation circuity to calculate a degree of priority for a specific area determined, by the determination circuity, to be usable (Mochizuki discloses determining a degree of priority of recognition of an object based on a calculated score of an available area ([0022] (an information collection control device for a vehicle capable of autonomous driving…the information collection control device including:…a score calculation unit configured to calculate a size of an area of overlap between the occupied areas of the object for recognition acquired before and after the time interval set by the time-interval setting unit; and a priority determination unit configured to determine a priority for transmission to the driving assistance control device based on the size of the area of overlap of the object for recognition calculated by the score calculation unit); see Figs. 4A-B; [0036] (degrees of importance of objects 24 for recognition…are quantified, and the priorities are determined by comparison of these quantified values); [0056] (score calculation unit 39 analyses this high-precision map and calculates, for each of the objects 24 for recognition, an area of overlap between the occupied areas of the same object 24 for recognition acquired before and after the predefined time interval…this corresponds to the quantified degree of importance of each of the objects 24 for recognition); [0057] (score calculation unit 39 is connected to the priority determination unit 42, and transmits a result of calculation…to the priority determination unit 42); [0058] (priority determination unit 42 compares the scores of respective objects 24 for recognition and determines priorities of the respective objects 24 for recognition for transmission to the driving assistance control unit 12))).

Regarding claim 4, Mochizuki discloses the autonomous driving assistance device according to claim 3, further comprising a degree-of-priority parameter acquisition circuity to acquire degree-of-priority parameters including at least a parameter regarding a movement efficiency of an autonomous driving vehicle that moves in the specific area, wherein the degree-of-priority calculation circuity is to calculate, by using each of the degree-of-priority parameters acquired by the degree-of-priority parameter acquisition circuity and a degree-of-priority weight corresponding to the degree-of-priority parameter, the degree of priority for the specific area determined to be usable (Mochizuki discloses determining degree of priority based on parameters such as the area occupied, the types and sizes of the obstacles, and the speed and distance of the obstacles and vehicle(s) from each other, with detection time intervals of 0.1 to 1.0 seconds (see Figs. 5 & 6A-B; [0099] (driving control unit 32 of the autonomous driving control device 20 analyses the image 26, tracks objects 24 for recognition…and applies them to a high-precision map…for autonomous driving); [0100] (autonomous driving is performed based on real-time information applied to this high-precision map…moving states of the same object 24 for recognition before and after the detection time interval indicated from the driving assistance control device 12 side are calculated); [0101] (location information depending on the change in time is acquired for a parked vehicle 24A, a traveling vehicle 24B, and a person 24C crossing a road as objects 24 for recognition in the vicinity of the vehicle 10 during autonomous driving); [0102] (detection time interval is 0.1 seconds…since the parked vehicle 24A is stationary…the area of overlap between the occupied areas of the parked vehicle 24A before and after the detection time interval is equal to the occupied area of the parked vehicle 24A (…100% occupancy)); [0103] (the traveling vehicle 24B is in motion…the area of overlap between the occupied areas of the traveling vehicle 24B before and after the detection time interval is smaller than the occupied area of the traveling vehicle 24B (…10% occupancy)); [0105] (the parked vehicle 24A is given priority over the traveling vehicle 24B and the person 24C); [0106] (the detection time interval is 1.0 second…the area of overlap between the occupied areas of the parked vehicle 24A before and after the detection time interval is equal to the occupied area of the parked vehicle 24A (…100% occupancy)); [0107] (the traveling vehicle 24B is in motion, the occupied areas of the traveling vehicle 24B before and after the detection time interval will not overlap))).
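The 100% and 10% occupancy figures cited from Mochizuki [0102]-[0107] follow from an overlap-area computation; a minimal sketch, assuming axis-aligned rectangular occupied areas (the rectangle representation and function name are illustrative assumptions, not Mochizuki's disclosure):

```python
# Sketch of the overlap score attributed to Mochizuki: the overlap between an
# object's occupied areas before and after a detection interval, expressed as
# a fraction of the occupied area. Axis-aligned boxes (x1, y1, x2, y2) and the
# function name are assumptions made for illustration.
def overlap_fraction(before, after):
    """Overlap area between two boxes, divided by the area of `before`."""
    w = max(0.0, min(before[2], after[2]) - max(before[0], after[0]))
    h = max(0.0, min(before[3], after[3]) - max(before[1], after[1]))
    area_before = (before[2] - before[0]) * (before[3] - before[1])
    return (w * h) / area_before

# Parked vehicle 24A: stationary, so overlap equals its occupied area (100%).
parked = overlap_fraction((0, 0, 2, 1), (0, 0, 2, 1))       # 1.0
# Traveling vehicle 24B: moved 1.8 units over the interval, leaving ~10% overlap.
moving = overlap_fraction((0, 0, 2, 1), (1.8, 0, 3.8, 1))   # ~0.1
# Per [0105], the higher-scoring (stationary) object gets transmission priority.
assert parked > moving
```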
Regarding claim 10, the combination of Mochizuki, Sugimoto, Kakuta and Yoshinaga discloses the autonomous driving assistance device according to claim 2, for example via the rationale to combine set forth in the rejection of corresponding parts of claim(s) 1-2 & 9 above, incorporated herein by reference, in an autonomous driving assistance system comprising: the autonomous driving assistance device according to claim 2 (in claim(s) 1-2 & 9, e.g. Mochizuki & Yoshinaga); and the sensor device which monitors the specific area (in claim 9, e.g. Mochizuki). It would have been obvious to combine for the reasons set forth in the rejection of corresponding parts of claim(s) 1-2 & 9 above, incorporated herein by reference.

Regarding claim 11, the combination of Mochizuki, Sugimoto, Kakuta and Yoshinaga discloses the autonomous driving assistance device according to claim 3, for example via the rationale to combine set forth in the rejection of corresponding parts of claim(s) 1-3 & 9 above, incorporated herein by reference, in an autonomous driving assistance system comprising: the autonomous driving assistance device according to claim 3 (in claim(s) 1-3 & 9, e.g. Mochizuki & Yoshinaga); and the sensor device which monitors the specific area (in claim 9, e.g. Mochizuki). It would have been obvious to combine for the reasons set forth in the rejection of corresponding parts of claim(s) 1-3 & 9 above, incorporated herein by reference.

Regarding claim 12, the combination of Mochizuki, Sugimoto, Kakuta and Yoshinaga discloses the autonomous driving assistance device according to claim 4, for example via the rationale to combine set forth in the rejection of corresponding parts of claim(s) 1-4 & 9 above, incorporated herein by reference, in an autonomous driving assistance system comprising: the autonomous driving assistance device according to claim 4 (in claim(s) 1-4 & 9, e.g. Mochizuki & Yoshinaga); and the sensor device which monitors the specific area (in claim 9, e.g. Mochizuki).
It would have been obvious to combine for the reasons set forth in the rejection of corresponding parts of claim(s) 1-4 & 9 above, incorporated herein by reference.

Claim(s) 5-8 and 13-20 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. 20230382389 A1 to Mochizuki in view of U.S. 9631941 B2 to Sugimoto, further in view of U.S. 20230174097 A1 to Kakuta as applied to the claims above, and further in view of U.S. 20180023966 A1 to Iwai et al. (Iwai).

Regarding claim 5, the combination of Mochizuki, Sugimoto and Kakuta discloses the autonomous driving assistance device according to claim 1, for example via the rationale to combine set forth in the rejection of corresponding parts of claim 1 above, incorporated herein by reference. However, the combination of Mochizuki, Sugimoto and Kakuta does not appear to further disclose: a restriction information acquisition circuitry to acquire restriction information as to whether or not the specific area is usable. Iwai, in the same field of endeavor, discloses a restriction information acquisition circuitry to acquire restriction information as to whether or not the specific area is usable (Iwai discloses traffic information storage unit 43, which stores restriction information pertaining to road conditions, traffic and weather…transmitted by the Vehicle Information Communication System (VICS) center, and server 2, which stores peripheral state information indicating restrictions surrounding a road ([0033] (server device 2 stores the enhanced map DB 21 and updates the enhanced map DB 21 by receiving the sensor information D2 from each of the driving assistance devices 1…enhanced map DB 21 includes road data, traffic information, peripheral state information and weather information…map information on positions and shapes of roads, traffic lanes, traffic intersection, signs and buildings…traffic information indicates traffic conditions of congested roads and the like…peripheral state information indicates
the state of surroundings of each road such as a lane restriction due to a traffic accident or constructions…weather information indicates present or forecasted weather on each road); [0038] (determination unit 13 determines the position of the own vehicle, the state of the own vehicle and the peripheral state of the own vehicle based on the output signals from the sensor unit 12…determination unit 13 recognizes a peripheral state of the road where the own vehicle is running, wherein examples of the peripheral state include a traffic accident location, a restricted lane and a newly established lane); [0042] (traffic information storage unit 43 stores traffic information on traffic jams and traffic restrictions included in the enhanced map information D1...traffic information storage unit 43 may receive and store traffic information on traffic jams and traffic restrictions delivered from a VICS…center…weather information storage unit 44 stores weather information included in the enhanced map information D1 or weather information corresponding to the own vehicle position and its periphery))).

Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to have modified the system of the combination of Mochizuki, Sugimoto and Kakuta to incorporate the map information processing device of Iwai, including a server device 2 and traffic information storage unit 43 that store restriction information pertaining to weather conditions, road conditions and/or traffic conditions along the route the vehicle is traveling, so that the vehicle can make better decisions based on potential and/or present hazards and restrictions, with predictable results and a reasonable expectation of success.
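The restriction information the examiner attributes to Iwai (traffic, peripheral-state, and weather data consulted to decide whether a specific area is usable) can be sketched as a simple keyed store. This is an illustrative assumption only: the record fields, the `area_usable` rule, and all names below are hypothetical and are not Iwai's actual API.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class RestrictionInfo:
    # Hypothetical record for the categories the examiner cites from
    # Iwai [0033]/[0042]: traffic, peripheral-state, and weather data
    # keyed by road segment.
    traffic: List[str] = field(default_factory=list)     # e.g. congestion, VICS feeds
    peripheral: List[str] = field(default_factory=list)  # e.g. lane restriction, accident
    weather: List[str] = field(default_factory=list)     # present/forecast weather

def area_usable(db: Dict[str, RestrictionInfo], segment: str) -> bool:
    """Treat a specific area as usable only when no restriction is
    recorded against it (a simplifying assumption for illustration)."""
    info = db.get(segment)
    return info is None or not (info.traffic or info.peripheral or info.weather)

# A segment with a recorded lane restriction is not usable; one with
# no recorded restrictions is.
db = {"segment-A": RestrictionInfo(peripheral=["lane restriction: construction"])}
blocked = area_usable(db, "segment-A")
clear = area_usable(db, "segment-B")
```

The design choice here, returning usable-by-default for unknown segments, is one possible policy; a conservative system could equally treat missing data as unusable.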
One of ordinary skill in the art would have been motivated to combine Mochizuki, Sugimoto, Kakuta and Iwai for the express benefit of determining and storing information pertaining to road, traffic and weather conditions on the road on which the autonomous vehicle is traveling, as explained in Iwai [0033], [0038] and [0042].

Regarding claim 6, the combination of Mochizuki, Sugimoto, Kakuta, Yoshinaga and Iwai discloses the autonomous driving assistance device according to claim 2, for example via the rationale to combine set forth in the rejection of corresponding parts of claim(s) 1-2 and 5 above, incorporated herein by reference, further comprising a restriction information acquisition circuitry to acquire restriction information as to whether or not the specific area is usable (in claim(s) 1-2 & 5, e.g. Mochizuki, Yoshinaga & Iwai). It would have been obvious to combine for the reasons set forth in the rejection of corresponding parts of claim(s) 1-2 & 5 above, incorporated herein by reference.

Regarding claim 7, the combination of Mochizuki, Sugimoto, Kakuta, Yoshinaga and Iwai discloses the autonomous driving assistance device according to claim 3, for example via the rationale to combine set forth in the rejection of corresponding parts of claim(s) 1-3 & 5-6 above, incorporated herein by reference, further comprising a restriction information acquisition circuitry to acquire restriction information as to whether or not the specific area is usable (in claim(s) 1-3 & 5-6, e.g. Mochizuki, Yoshinaga & Iwai). It would have been obvious to combine for the reasons set forth in the rejection of corresponding parts of claim(s) 1-3 & 5-6 above, incorporated herein by reference.
Regarding claim 8, the combination of Mochizuki, Sugimoto, Kakuta, Yoshinaga and Iwai discloses the autonomous driving assistance device according to claim 4, for example via the rationale to combine set forth in the rejection of corresponding parts of claim(s) 1-7 above, incorporated herein by reference, further comprising a restriction information acquisition circuitry to acquire restriction information as to whether or not the specific area is usable (in claim(s) 1-7, e.g. Mochizuki, Yoshinaga & Iwai). It would have been obvious to combine for the reasons set forth in the rejection of corresponding parts of claim(s) 1-7 above, incorporated herein by reference.

Regarding claim 13, the combination of Mochizuki, Sugimoto, Kakuta, Yoshinaga and Iwai discloses the autonomous driving assistance device according to claim 5, for example via the rationale to combine set forth in the rejection of corresponding parts of claim(s) 1, 5 and 9 above, incorporated herein by reference, in an autonomous driving assistance system comprising: the autonomous driving assistance device according to claim 5 (in claim(s) 1, 5 & 9, e.g. Mochizuki & Iwai); and the sensor device which monitors the specific area (in claim 9, e.g. Mochizuki). It would have been obvious to combine for the reasons set forth in the rejection of corresponding parts of claim(s) 1, 5 & 9 above, incorporated herein by reference.

Regarding claim 14, the combination of Mochizuki, Sugimoto, Kakuta, Yoshinaga and Iwai discloses the autonomous driving assistance device according to claim 6, for example via the rationale to combine set forth in the rejection of corresponding parts of claim(s) 1-2, 5-6 and 9 above, incorporated herein by reference, in an autonomous driving assistance system comprising: the autonomous driving assistance device according to claim 6 (in claim(s) 1-2, 5-6 & 9, e.g. Mochizuki, Yoshinaga & Iwai); and the sensor device which monitors the specific area (in claim 9, e.g. Mochizuki).
It would have been obvious to combine for the reasons set forth in the rejection of corresponding parts of claim(s) 1-2, 5-6 & 9 above, incorporated herein by reference.

Regarding claim 15, the combination of Mochizuki, Sugimoto, Kakuta, Yoshinaga and Iwai discloses the autonomous driving assistance device according to claim 7, for example via the rationale to combine set forth in the rejection of corresponding parts of claim(s) 1-3, 5-7 and 9 above, incorporated herein by reference, in an autonomous driving assistance system comprising: the autonomous driving assistance device according to claim 7 (in claim(s) 1-3, 5-7 & 9, e.g. Mochizuki, Yoshinaga & Iwai); and the sensor device which monitors the specific area (in claim 9, e.g. Mochizuki). It would have been obvious to combine for the reasons set forth in the rejection of corresponding parts of claim(s) 1-3, 5-7 & 9 above, incorporated herein by reference.

Regarding claim 16, the combination of Mochizuki, Sugimoto, Kakuta, Yoshinaga and Iwai discloses the autonomous driving assistance device according to claim 8, for example via the rationale to combine set forth in the rejection of corresponding parts of claim(s) 1-9 above, incorporated herein by reference, in an autonomous driving assistance system comprising: the autonomous driving assistance device according to claim 8 (in claim(s) 1-9, e.g. Mochizuki, Yoshinaga & Iwai); and the sensor device which monitors the specific area (in claim 9, e.g. Mochizuki). It would have been obvious to combine for the reasons set forth in the rejection of corresponding parts of claim(s) 1-9 above, incorporated herein by reference.
Regarding claim 17, the combination of Mochizuki, Sugimoto, Kakuta, Yoshinaga and Iwai discloses the autonomous driving assistance device according to claim 5, for example via the rationale to combine set forth in the rejection of corresponding parts of claim(s) 1, 5, 9 and 13 above, incorporated herein by reference, in an autonomous driving assistance system comprising: the autonomous driving assistance device according to claim 5 (in claim(s) 1, 5, 9 & 13, e.g. Mochizuki & Iwai); and the sensor device which monitors the specific area (in claim 9, e.g. Mochizuki).

Mochizuki further discloses: an operation device through which information is inputted to the autonomous driving assistance device (Mochizuki discloses input device 40A to input information to the driving control device 12 ([0066] (user interface 40 is equipped with an input device (a keyboard, a mouse, a microphone, etc.) 40A); [0068] (Based on the images displayed on the monitor 40B, the operator OP determines what kind of assistance is to be provided and inputs assistance instruction information through the input device 40A); [0069] (assistance control unit 52 receives the assistance instruction information received from the input device 40A and transmits it to the autonomous driving control device 20 via the center-side communication unit 50))), wherein the restriction information acquisition circuitry acquires, from the operation device, the restriction information as to whether or not the specific area is usable (in claim 5, e.g. Iwai). It would have been obvious to combine for the reasons set forth in the rejection of corresponding parts of claim(s) 1, 5, 9 & 13 above, incorporated herein by reference.
Regarding claim 18, the combination of Mochizuki, Sugimoto, Kakuta, Yoshinaga and Iwai discloses the autonomous driving assistance device according to claim 6, for example via the rationale to combine set forth in the rejection of corresponding parts of claim(s) 1-2, 5-6, 9, 14 & 17 above, incorporated herein by reference, in an autonomous driving assistance system comprising: the autonomous driving assistance device according to claim 6 (in claim(s) 1-2, 5-6, 9 & 14, e.g. Mochizuki, Yoshinaga & Iwai); the sensor device which monitors the specific area (in claim 9, e.g. Mochizuki); and an operation device through which information is inputted to the autonomous driving assistance device (in claim 17, e.g. Mochizuki), wherein the restriction information acquisition circuitry acquires, from the operation device, the restriction information as to whether or not the specific area is usable (in claim 5, e.g. Iwai). It would have been obvious to combine for the reasons set forth in the rejection of corresponding parts of claim(s) 1-2, 5-6, 9, 14 & 17 above, incorporated herein by reference.

Regarding claim 19, the combination of Mochizuki, Sugimoto, Kakuta, Yoshinaga and Iwai discloses the autonomous driving assistance device according to claim 7, for example via the rationale to combine set forth in the rejection of corresponding parts of claim(s) 1-3, 5-7, 9, 15 & 17-18 above, incorporated herein by reference, in an autonomous driving assistance system comprising: the autonomous driving assistance device according to claim 7 (in claim(s) 1-3, 5-7, 9 & 15, e.g. Mochizuki, Yoshinaga & Iwai); the sensor device which monitors the specific area (in claim 9, e.g. Mochizuki); and an operation device through which information is inputted to the autonomous driving assistance device (in claim 17, e.g. Mochizuki), wherein the restriction information acquisition circuitry acquires, from the operation device, the restriction information as to whether or not the specific area is usable (in claim 5, e.g. Iwai).
It would have been obvious to combine for the reasons set forth in the rejection of corresponding parts of claim(s) 1-3, 5-7, 9, 15 & 17 above, incorporated herein by reference.

Regarding claim 20, the combination of Mochizuki, Sugimoto, Kakuta, Yoshinaga and Iwai discloses the autonomous driving assistance device according to claim 8, for example via the rationale to combine set forth in the rejection of corresponding parts of claim(s) 1-9 and 16-17 above, incorporated herein by reference, in an autonomous driving assistance system comprising: the autonomous driving assistance device according to claim 8 (in claim(s) 1-9 & 16, e.g. Mochizuki, Yoshinaga & Iwai); the sensor device which monitors the specific area (in claim 9, e.g. Mochizuki); and an operation device through which information is inputted to the autonomous driving assistance device (in claim 17, e.g. Mochizuki), wherein the restriction information acquisition circuitry acquires, from the operation device, the restriction information as to whether or not the specific area is usable (in claim 5, e.g. Iwai). It would have been obvious to combine for the reasons set forth in the rejection of corresponding parts of claim(s) 1-9 & 16-17 above, incorporated herein by reference.

Conclusion

The prior art made of record and not relied upon is considered pertinent to Applicant's disclosure as teaching the state of the art of autonomous driving assistance device(s) and autonomous driving assistance system(s) at the time of filing.
For example: US 20190139406 A1 to Adachi; Yoshiaki teaches, inter alia, a Driving Assistance Device in, for example, the ABSTRACT, Figures and/or Paragraphs below: "A plurality of items of environmental information, which are used for distribution control of plural items of vehicle information that are received using wireless communication by an external communication device from a plurality of surrounding vehicles, are made into an environmental information list by an environmental information setting mechanism and managed. Based on the environmental information list, for the plural items of vehicle information received by the external equipment communication device, selection is performed using a dynamic filtering part of a dynamic distribution controller and distribution control is performed using the priority level according to a dynamic priority controller. A driving assistance device that receives this information assists a driver in driving, and environmental information set in the environmental information list is generated."

US 20200004269 A1 to Oba; Eiji teaches, inter alia, a TRAVELING ASSISTANCE DEVICE, TRAVELING ASSISTANCE MANAGEMENT DEVICE, METHODS OF SAME DEVICES, AND TRAVELING ASSISTANCE SYSTEM in, for example, the ABSTRACT, Figures and/or Paragraphs below: "A traveling assistance device provided on a traveling assistance target vehicle 11 includes: an outside-vehicle information acquisition unit that acquires outside-vehicle information; a communication unit that communicates with a traveling assistance management device 15 providing leading vehicle information; and a traveling control unit that performs following traveling control for traveling while following a leading vehicle indicated by leading vehicle information acquired from the traveling assistance management device.
The following traveling control is performed using the outside-vehicle information acquired by the outside-vehicle information acquisition unit and the leading vehicle information. The traveling assistance management device 15 that manages the traveling assistance at a position away from the vehicle includes: a communication unit that communicates with the traveling assistance target vehicle 11; and an information processing unit that selects, as a leading vehicle, a candidate vehicle 12-1 scheduled to travel in a traveling schedule route of the traveling assistance target vehicle at a traveling schedule time of the traveling assistance target vehicle from candidate vehicles in response to a request for leading vehicle information from the traveling assistance target vehicle, and notifies the traveling assistance target vehicle of leading vehicle information indicating the leading vehicle. Following traveling is automatically and efficiently achievable."

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ROBERT L PINKERTON, whose telephone number is (571) 272-9820. The examiner can normally be reached M-Th 9:00-4:00.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Hunter Lonsberry, can be reached at 571-272-7298. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center.
Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ROBERT L PINKERTON/
Examiner, Art Unit 3665

/DANIEL L GREENE/
Primary Examiner, Art Unit 3665

Prosecution Timeline

Oct 04, 2024
Application Filed
Jan 23, 2026
Non-Final Rejection — §101, §103
Apr 06, 2026
Response Filed

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12498252
WORLD MODEL GENERATION AND CORRECTION FOR AUTONOMOUS VEHICLES
2y 5m to grant Granted Dec 16, 2025
Patent 12497080
METHOD AND SYSTEM FOR CONTROLLING AUTONOMOUS OR SEMI-AUTONOMOUS VEHICLE
2y 5m to grant Granted Dec 16, 2025
Patent 12492916
POSITIONING OPERATION BASED ON FILTERED MAP DATA
2y 5m to grant Granted Dec 09, 2025
Patent 12493296
MOBILITY PLATFORM FOR AUTONOMOUS NAVIGATION OF WORKSITES
2y 5m to grant Granted Dec 09, 2025
Patent 12480781
CROWD-SOURCING LANE LINE MAPS FOR A VEHICLE
2y 5m to grant Granted Nov 25, 2025
Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
86%
Grant Probability
99%
With Interview (+22.1%)
2y 8m
Median Time to Grant
Low
PTA Risk
Based on 70 resolved cases by this examiner. Grant probability derived from career allow rate.
