Prosecution Insights
Last updated: April 19, 2026
Application No. 18/632,037

AUTOMATED TRAVEL DEVICE

Non-Final OA (§103, §112)
Filed: Apr 10, 2024
Examiner: MCCULLERS, AARON KYLE
Art Unit: 3663
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: DENSO CORPORATION
OA Round: 1 (Non-Final)
Grant Probability: 44% (Moderate)
OA Rounds: 1-2
To Grant: 3y 5m
Grant Probability With Interview: 77%

Examiner Intelligence

Career Allow Rate: 44% (32 granted / 72 resolved; -7.6% vs TC avg)
Interview Lift: +32.8% (strong; allow rate with vs. without an interview, among resolved cases with an interview)
Avg Prosecution: 3y 5m (typical timeline)
Currently Pending: 30
Total Applications: 102 (career history, across all art units)

Statute-Specific Performance

§101: 10.5% (-29.5% vs TC avg)
§103: 57.1% (+17.1% vs TC avg)
§102: 12.5% (-27.5% vs TC avg)
§112: 18.2% (-21.8% vs TC avg)
Tech Center averages are estimates. Based on career data from 72 resolved cases.
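The headline figures above are simple arithmetic over the examiner's career data. A minimal sketch of how they relate, assuming the interview lift is simply additive to the career allow rate (an assumption about how the tool combines the numbers, not a documented formula):

```python
# Sanity-check of the dashboard arithmetic above. The additive treatment of
# the interview lift is an assumption, not a documented formula.

def allow_rate_pct(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

career = allow_rate_pct(32, 72)        # 44.4%, shown rounded as 44%
tc_average = career + 7.6              # dashboard reports -7.6% vs TC avg
with_interview = career + 32.8         # +32.8% interview lift, if additive

print(round(career, 1))        # 44.4
print(round(with_interview))   # 77, matching the "With Interview" figure
```

Under that additive reading, 44.4% + 32.8% rounds to the 77% shown for cases with an interview.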

Office Action

§103, §112
DETAILED ACTION

This action is in reply to an application filed April 10, 2024. Claims 1-9 are currently pending. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Information Disclosure Statement

The information disclosure statements (IDS) submitted on April 10, 2024 and September 4, 2024 were filed. The submissions are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.

Claim Objections

Claims 1 and 6-9 are objected to because of the following informalities: the claims recite in lines 9-10 of each claim “a surrounding vehicle recognition unit configured to recognize a merging vehicle, which is an other vehicle attempting to merge,” which is poor grammar and should be “a surrounding vehicle recognition unit configured to recognize a merging vehicle, which is another vehicle attempting to merge.” Appropriate correction is required.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked. 
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.

Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 
112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitations use a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function, and the generic placeholder is not preceded by a structural modifier. Such claim limitations are: “vehicle control unit” in claims 1-9, “merge-point recognition unit” in claims 1 and 6-9, and “surrounding vehicle recognition unit” in claims 1 and 3-9.

Because these claim limitations are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, they are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.

If applicant does not intend to have these limitations interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitations to avoid them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitations recite sufficient structure to perform the claimed function so as to avoid them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 
112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

Claims 1-9 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claims contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.

Claims 1-9 recite the element “vehicle control unit,” which invokes 112(f) but lacks supporting structure in the original disclosure. Claims 1 and 6-9 recite the element “merge-point recognition unit,” which invokes 112(f) but lacks supporting structure in the original disclosure. Claims 1 and 3-9 recite the element “surrounding vehicle recognition unit,” which invokes 112(f) but lacks supporting structure in the original disclosure.

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

Claims 1-9 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. 
Claim limitations “vehicle control unit”, “merge-point recognition unit”, and “surrounding vehicle recognition unit” invoke 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. However, the written description fails to disclose the corresponding structure, material, or acts for performing the entire claimed function and to clearly link the structure, material, or acts to the function. There are no recitations of any corresponding structure, material, or acts for performing the entire claimed function in the original disclosure. Therefore, the claims are indefinite and are rejected under 35 U.S.C. 112(b) or pre-AIA 35 U.S.C. 112, second paragraph.

Applicant may:

(a) Amend the claim so that the claim limitation will no longer be interpreted as a limitation under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph;
(b) Amend the written description of the specification such that it expressly recites what structure, material, or acts perform the entire claimed function, without introducing any new matter (35 U.S.C. 132(a)); or
(c) Amend the written description of the specification such that it clearly links the structure, material, or acts disclosed therein to the function recited in the claim, without introducing any new matter (35 U.S.C. 132(a)).

If applicant is of the opinion that the written description of the specification already implicitly or inherently discloses the corresponding structure, material, or acts and clearly links them to the function so that one of ordinary skill in the art would recognize what structure, material, or acts perform the claimed function, applicant should clarify the record by either:

(a) Amending the written description of the specification such that it expressly recites the corresponding structure, material, or acts for performing the claimed function and clearly links or associates the structure, material, or acts to the claimed function, without introducing any new matter (35 U.S.C. 
132(a)); or
(b) Stating on the record what the corresponding structure, material, or acts, which are implicitly or inherently set forth in the written description of the specification, perform the claimed function.

For more information, see 37 CFR 1.75(d) and MPEP §§ 608.01(o) and 2181.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 3-5, and 7 are rejected under 35 U.S.C. 103 as being unpatentable over Takashi; Komaru (JP Pub. No. 2018169895 A), hereinafter Takashi, and further in view of Okuyama et al. (US Pub. No. 20200231158 A1), hereinafter Okuyama.

Regarding claim 1, Takashi teaches [a]n automated travel device, comprising (Takashi: Para. 
0038, teaching a device for automatically controlling the operations of a vehicle): a vehicle control unit configured to execute automated driving control to allow a subject vehicle to travel autonomously along a predetermined scheduled travel route based on a signal from a surrounding monitoring sensor (Takashi: Para. 0045, teaching that the control of the vehicle is based on sensors that monitor the surroundings of the vehicle); a merge-point recognition unit configured to recognize a merge point with a main road, on which the subject vehicle travels, based on at least one of map data, a detection result of the surrounding monitoring sensor, or data acquired from an external apparatus via wireless communications (Takashi: Para. 0058, teaching that the vehicle is controlled automatically to merge into a lane based on information on the surroundings acquired from sensors); and a surrounding vehicle recognition unit configured to recognize a merging vehicle, which is an other vehicle attempting to merge at the merge point into the main road, based on at least one of the detection result of the surrounding monitoring sensor or the data acquired from the external apparatus via wireless communications (Takashi: Para. 
0089, teaching that during merging of the vehicle into a lane, the system adjusts the host vehicle based on other vehicles in the lane that the host vehicle is merging into), wherein the vehicle control unit is configured, during execution of the automated driving control, to perform control to change a running position of the subject vehicle with respect to a surrounding vehicle on the main road, when a remaining distance to the merge point reaches less than a predetermined value, the control to change the running position of the subject vehicle with respect to the surrounding vehicle on the main road includes control to increase temporarily an inter-vehicle distance from a preceding vehicle by a predetermined amount, during execution of the automated driving control, and when the surrounding vehicle recognition unit recognizes the merging vehicle (Takashi: Para. 0089, teaching that during merging of the vehicle into a lane, the system adjusts the speed of the host vehicle and the separation distance between it and the other vehicles in the lane that the host vehicle is merging into).

Takashi is silent to the vehicle control unit is configured, during execution of the automated driving control, to perform a takeover request to an occupant on a driver seat to ask a driving operation, when a predetermined number of merging vehicles merge between the subject vehicle and a preceding vehicle, and when the merging vehicles remain. In a similar field, Okuyama teaches the vehicle control unit is configured, during execution of the automated driving control, to perform a takeover request to an occupant on a driver seat to ask a driving operation, when a predetermined number of merging vehicles merge between the subject vehicle and a preceding vehicle, and when the merging vehicles remain (Okuyama: Para. 
0069 and 0071, teaching that during a merging operation of the host vehicle to another lane, the system hands over control of the vehicle from the autonomous system to the driver based on the congestion of the lane the vehicle is merging into) for the benefit of allowing the driver to merge in a situation where it is difficult for the autonomous system to act. It would have been obvious to one ordinarily skilled in the art before the filing of the application to modify the autonomous merging operation from Takashi to hand over control of the vehicle to the driver when the merging lane is congested, as taught by Okuyama, for the benefit of allowing the driver to merge in a situation where it is difficult for the autonomous system to act.

Regarding claim 3, Takashi and Okuyama remain as applied in claim 1, and Takashi goes on to further teach [t]he automated travel device according to claim 1, wherein the vehicle control unit is configured, during execution of the automated driving control, to temporarily increase a set value of the inter-vehicle distance, which is a control target of the automated driving control, by a predetermined amount, when the surrounding vehicle recognition unit recognizes the merging vehicle (Takashi: Para. 0089, teaching that during merging of the vehicle into a lane, the system adjusts the speed of the host vehicle and the separation distance between it and the other vehicles in the lane that the host vehicle is merging into).

Regarding claim 4, Takashi and Okuyama remain as applied in claim 1, and Takashi goes on to further teach [t]he automated travel device according to claim 1, wherein the vehicle control unit is configured to, during execution of the automated driving control, temporarily reduce speed by a predetermined amount, when the surrounding vehicle recognition unit recognizes the merging vehicle (Takashi: Para. 
0089, teaching that during merging of the vehicle into a lane, the system adjusts the speed of the host vehicle and the separation distance between it and the other vehicles in the lane that the host vehicle is merging into).

Regarding claim 5, Takashi and Okuyama remain as applied in claim 1, and Takashi goes on to further teach [t]he automated travel device according to claim 1, wherein the surrounding vehicle recognition unit is configured to recognize the preceding vehicle based on an input signal from the surrounding monitoring sensor (Takashi: Para. 0045, teaching that the control of the vehicle is based on sensors that monitor the surroundings of the vehicle), and the vehicle control unit is configured to, during execution of the automated driving control, and when the merging vehicle is recognized, perform control to increase an inter-vehicle distance from the preceding vehicle by a predetermined amount, when the preceding vehicle exists, select whether to maintain a current speed or reduce the current speed according to a relative position and a relative speed of the merging vehicle with respect to the subject vehicle, when the preceding vehicle does not exist (Takashi: Para. 0089, teaching that during merging of the vehicle into a lane, the system adjusts the speed of the host vehicle and the separation distance between it and the other vehicles in the lane that the host vehicle is merging into).

Regarding claim 7, Takashi teaches [a]n automated travel device, comprising (Takashi: Para. 0038, teaching a device for automatically controlling the operations of a vehicle): a vehicle control unit configured to execute automated driving control to allow a subject vehicle to travel autonomously along a predetermined scheduled travel route based on a signal from a surrounding monitoring sensor (Takashi: Para. 
0045, teaching that the control of the vehicle is based on sensors that monitor the surroundings of the vehicle); a merge-point recognition unit configured to recognize a merge point with a main road, on which the subject vehicle travels, based on at least one of map data, a detection result of the surrounding monitoring sensor, or data acquired from an external apparatus via wireless communications (Takashi: Para. 0058, teaching that the vehicle is controlled automatically to merge into a lane based on information on the surroundings acquired from sensors); and a surrounding vehicle recognition unit configured to recognize a merging vehicle, which is an other vehicle attempting to merge at the merge point into the main road, based on at least one of the detection result of the surrounding monitoring sensor or the data acquired from the external apparatus via wireless communications (Takashi: Para. 0089, teaching that during merging of the vehicle into a lane, the system adjusts the host vehicle based on other vehicles in the lane that the host vehicle is merging into), wherein the vehicle control unit is configured, during execution of the automated driving control, to perform control to change a running position of the subject vehicle with respect to a surrounding vehicle on the main road, when a remaining distance to the merge point reaches less than a predetermined value (Takashi: Para. 0089, teaching that during merging of the vehicle into a lane, the system adjusts the speed of the host vehicle and the separation distance between it and the other vehicles in the lane that the host vehicle is merging into). 
Takashi is silent to the vehicle control unit is configured to perform merge development control to prompt the merging vehicle to merge in front of the subject vehicle, when the surrounding vehicle recognition unit recognizes the merging vehicle, during execution of the automated driving control on an ordinary road, and not to perform the merge development control, during execution of the automated driving control on a limited-access road. In a similar field, Okuyama teaches the vehicle control unit is configured to perform merge development control to prompt the merging vehicle to merge in front of the subject vehicle, when the surrounding vehicle recognition unit recognizes the merging vehicle, during execution of the automated driving control on an ordinary road, and not to perform the merge development control, during execution of the automated driving control on a limited-access road (Okuyama: Para. 0003 and 0005, teaching that autonomous control for merging operations is allowed or manual control is required based on the type of road) for the benefit of allowing the driver to merge in a situation where it is difficult for the autonomous system to act. It would have been obvious to one ordinarily skilled in the art before the filing of the application to modify the autonomous merging operation from Takashi to hand over control of the vehicle to the driver when the lane that the vehicle is merging into is not a normal road, as taught by Okuyama, for the benefit of allowing the driver to merge in a situation where it is difficult for the autonomous system to act.

Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Takashi in view of Okuyama as applied to claim 1 above, and further in view of Mimura et al. (US Pub. No. 20170315551 A1), hereinafter Mimura. 
Regarding claim 2, Takashi and Okuyama remain as applied in claim 1, and Okuyama goes on to further teach [t]he automated travel device according to claim 1, wherein the automated travel device is to be connected with an outward display device that is configured to display an image in an area, which is visually recognizable by a driver of the merging vehicle (Okuyama: Para. 0130, teaching a display that shows the area around the vehicle). They are silent to the vehicle control unit is configured, during execution of the automated driving control, to change an operation manner of the outward display device depending on a number of merging vehicles merging between the subject vehicle and the preceding vehicle. In a similar field, Mimura teaches the vehicle control unit is configured, during execution of the automated driving control, to change an operation manner of the outward display device depending on a number of merging vehicles merging between the subject vehicle and the preceding vehicle (Mimura: Para. 0134, teaching that the display shows the vehicles in the area around the host vehicle) for the benefit of enhanced awareness for the driver of the surroundings during merging operations. It would have been obvious to one ordinarily skilled in the art before the effective filing date of the applicant’s claimed invention to modify the merging operations from Takashi to display the vehicles around the vehicle to the driver, as taught by Mimura, for the benefit of enhanced awareness for the driver of the surroundings during merging operations.

Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Takashi and further in view of Shalev-Shwartz et al. (US Pub. No. 20210200235 A1), hereinafter Shalev-Shwartz.

Regarding claim 6, Takashi teaches [a]n automated travel device, comprising (Takashi: Para. 
0038, teaching a device for automatically controlling the operations of a vehicle): a vehicle control unit configured to execute automated driving control to allow a subject vehicle to travel autonomously along a predetermined scheduled travel route based on a signal from a surrounding monitoring sensor (Takashi: Para. 0045, teaching that the control of the vehicle is based on sensors that monitor the surroundings of the vehicle); a merge-point recognition unit configured to recognize a merge point with a main road, on which the subject vehicle travels, based on at least one of map data, a detection result of the surrounding monitoring sensor, or data acquired from an external apparatus via wireless communications (Takashi: Para. 0058, teaching that the vehicle is controlled automatically to merge into a lane based on information on the surroundings acquired from sensors); and a surrounding vehicle recognition unit configured to recognize a merging vehicle, which is an other vehicle attempting to merge at the merge point into the main road, based on at least one of the detection result of the surrounding monitoring sensor or the data acquired from the external apparatus via wireless communications (Takashi: Para. 0089, teaching that during merging of the vehicle into a lane, the system adjusts the host vehicle based on other vehicles in the lane that the host vehicle is merging into), wherein the vehicle control unit is configured, during execution of the automated driving control, to perform control to change a running position of the subject vehicle with respect to a surrounding vehicle on the main road, when a remaining distance to the merge point reaches less than a predetermined value (Takashi: Para. 0089, teaching that during merging of the vehicle into a lane, the system adjusts the speed of the host vehicle and the separation distance between it and the other vehicles in the lane that the host vehicle is merging into). 
Takashi is silent to the vehicle control unit is configured to, during execution of the automated driving control, and when the surrounding vehicle recognition unit recognizes the merging vehicle, execute front space shortening control to decrease temporarily an inter-vehicle distance from a preceding vehicle by a predetermined amount, when no traffic congestion occurs, and execute front space extension control to increase the inter-vehicle distance from the preceding vehicle by a predetermined amount, when traffic congestion occurs. In a similar field, Shalev-Shwartz teaches the vehicle control unit is configured to, during execution of the automated driving control, and when the surrounding vehicle recognition unit recognizes the merging vehicle, execute front space shortening control to decrease temporarily an inter-vehicle distance from a preceding vehicle by a predetermined amount, when no traffic congestion occurs, and execute front space extension control to increase the inter-vehicle distance from the preceding vehicle by a predetermined amount, when traffic congestion occurs (Shalev-Shwartz: Para. 0632 and 0838, teaching that the separation distance between the host vehicle and preceding vehicle is adjusted based on the level of congestion of the road) for the benefit of improved safety of the vehicle during autonomous merging operations. It would have been obvious to one ordinarily skilled in the art before the filing of the application to modify the autonomous merging operation from Takashi to adjust the distance between the host vehicle and the preceding vehicle based on how congested the traffic is, as taught by Shalev-Shwartz, for the benefit of improved safety of the vehicle during autonomous merging operations.

Claims 8-9 are rejected under 35 U.S.C. 103 as being unpatentable over Takashi, and further in view of Ueda et al. (US Pub. No. 20210300317 A1), hereinafter Ueda. 
Regarding claim 8, Takashi teaches [a]n automated travel device, comprising (Takashi: Para. 0038, teaching a device for automatically controlling the operations of a vehicle): a vehicle control unit configured to execute automated driving control to allow a subject vehicle to travel autonomously along a predetermined scheduled travel route based on a signal from a surrounding monitoring sensor (Takashi: Para. 0045, teaching that the control of the vehicle is based on sensors that monitor the surroundings of the vehicle); a merge-point recognition unit configured to recognize a merge point with a main road, on which the subject vehicle travels, based on at least one of map data, a detection result of the surrounding monitoring sensor, or data acquired from an external apparatus via wireless communications (Takashi: Para. 0058, teaching that the vehicle is controlled automatically to merge into a lane based on information on the surroundings acquired from sensors); and a surrounding vehicle recognition unit configured to recognize a merging vehicle, which is an other vehicle attempting to merge at the merge point into the main road, based on at least one of the detection result of the surrounding monitoring sensor or the data acquired from the external apparatus via wireless communications (Takashi: Para. 0089, teaching that during merging of the vehicle into a lane, the system adjusts the host vehicle based on other vehicles in the lane that the host vehicle is merging into)…, wherein the vehicle control unit is configured, during execution of the automated driving control, to perform control to change a running position of the subject vehicle with respect to a surrounding vehicle on the main road, when a remaining distance to the merge point reaches less than a predetermined value (Takashi: Para. 
0089, teaching that during merging of the vehicle into a lane, the system adjusts the speed of the host vehicle and the separation distance between it and the other vehicles in the lane that the host vehicle is merging into).

Takashi is silent to a driver condition determination unit configured to determine whether the driver performs a second task based on an image of a driver taken by a cabin camera… and the vehicle control unit is configured, during execution of the automated driving control, and when the surrounding vehicle recognition unit recognizes the merging vehicle, to perform merge development control to prompt the merging vehicle to merge in front of the subject vehicle, when the driver performs the second task, and not perform the merge development control when the driver does not perform the second task. In a similar field, Ueda teaches a driver condition determination unit configured to determine whether the driver performs a second task based on an image of a driver taken by a cabin camera (examiner interprets that the second task can be any task that indicates that the autonomous vehicle may perform the autonomous action such as the driver paying attention to the road) (Ueda: Para. 0090, teaching monitoring the attention of a driver to the road)…, and the vehicle control unit is configured, during execution of the automated driving control, and when the surrounding vehicle recognition unit recognizes the merging vehicle, to perform merge development control to prompt the merging vehicle to merge in front of the subject vehicle, when the driver performs the second task, and not perform the merge development control when the driver does not perform the second task (Ueda: Para. 0090, teaching monitoring the attention of a driver to the road; and Para. 
0092, teaching that whether the autonomous operation of the vehicle is conducted or if the control is handed over the driver is determined based on whether the driver is paying attention to the road) for the benefit of allowing the driver to merge in a situation where it is difficult for the autonomous system to act. It would have been obvious to one ordinarily skilled in the art before the filing of the application to modify the autonomous merging operation from Takashi to determine whether or not to perform the autonomous operation based on if the driver is performing the task of paying attention, as taught by Ueda, for the benefit of preventing manual operations when the user is not prepared to take over control of the vehicle. Regarding claim 9, Takashi teaches [a]n automated travel device, comprising (Takashi: Para. 0038, teaching a device for automatically controlling the operations of a vehicle): a vehicle control unit configured to execute automated driving control to allow a subject vehicle to travel autonomously along a predetermined scheduled travel route based on a signal from a surrounding monitoring sensor (Takashi: Para. 0045, teaching that the control of the vehicle is based on sensors the monitor the surroundings of the vehicle); a merge-point recognition unit configured to recognize a merge point with a main road, on which the subject vehicle travels, based on at least one of map data, a detection result of the surrounding monitoring sensor, or data acquired from an external apparatus via wireless communications (Takashi: Para. 
0058, teaching that the vehicle is controlled automatically to merge into a lane based on information on the surroundings acquired from sensors); and a surrounding vehicle recognition unit configured to recognize a merging vehicle, which is an other vehicle attempting to merge at the merge point into the main road, based on at least one of the detection result of the surrounding monitoring sensor or the data acquired from the external apparatus via wireless communications (Takashi: Para. 0089, teaching that during merging of the vehicle into a lane, the system adjusts the host vehicle based on other vehicles in the lane that the host vehicle is merging into)…, the vehicle control unit is configured, during execution of the automated driving control, to perform control to change a running position of the subject vehicle with respect to a surrounding vehicle on the main road, when a remaining distance to the merge point reaches less than a predetermined value (Takashi: Para. 0089, teaching that during merging of the vehicle into a lane, the system adjusts the speed of the host vehicle and the separation distance between it and the other vehicles in the lane that the host vehicle is merging into). Takashi is silent to a driver condition determination unit configured to determine whether the driver performs a second task based on an image of a driver taken by a cabin camera… and the vehicle control unit is configured, during execution of the automated driving control, when the surrounding vehicle recognition unit recognizes the merging vehicle, to execute processing to inquire the driver whether the driver permits merge development control to prompt the merging vehicle to merge in front of the subject vehicle, when the driver performs the second task. 
In a similar field, Ueda teaches a driver condition determination unit configured to determine whether the driver performs a second task based on an image of a driver taken by a cabin camera (examiner interprets that the second task can be any task that indicates that the autonomous vehicle may perform the autonomous action such as the driver paying attention to the road) (Ueda: Para. 0090, teaching monitoring the attention of a driver to the road)…, and the vehicle control unit is configured, during execution of the automated driving control, when the surrounding vehicle recognition unit recognizes the merging vehicle, to execute processing to inquire the driver whether the driver permits merge development control to prompt the merging vehicle to merge in front of the subject vehicle, when the driver performs the second task (Ueda: Para. 0090, teaching monitoring the attention of a driver to the road; Para. 0091, teaching that the switch from autonomous operations to manual operations is performed if the driver allows autonomous operation via an operation switch; and Para. 0092, teaching that whether the autonomous operation of the vehicle is conducted or the control is handed over to the driver is determined based on whether the driver is paying attention to the road) for the benefit of allowing the driver to merge in a situation where it is difficult for the autonomous system to act.

It would have been obvious to one of ordinary skill in the art before the filing of the application to modify the autonomous merging operation from Takashi to determine whether or not to perform the autonomous operation based on whether the driver is performing the task of paying attention when they operate a device that initiates manual control of the vehicle, as taught by Ueda, for the benefit of preventing manual operations when the user is not prepared to take over control of the vehicle.
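For claim-mapping purposes, the conditional behavior at issue in claims 8 and 9 can be sketched in a few lines. This is an illustrative model only, assuming a simple decision function; every name in it (`DriverState`, `decide_merge_action`, the returned action strings) is hypothetical and does not appear in the application or the cited references:

```python
# Hypothetical sketch of the decision logic recited in claims 8 and 9.
# All identifiers are illustrative, not taken from the application.
from dataclasses import dataclass

@dataclass
class DriverState:
    performing_second_task: bool  # e.g., judged from a cabin-camera image

def decide_merge_action(merging_vehicle_recognized: bool,
                        driver: DriverState,
                        inquire: bool = False) -> str:
    """Choose a control action during execution of automated driving control.

    Claim 8: perform merge development control only when the driver performs
    the second task; otherwise do not perform it.
    Claim 9: additionally, first inquire whether the driver permits the
    merge development control.
    """
    if not merging_vehicle_recognized:
        return "continue_automated_driving"
    if not driver.performing_second_task:
        return "no_merge_development_control"
    if inquire:
        return "ask_driver_permission"          # claim 9 variant
    return "perform_merge_development_control"  # claim 8 variant
```

Under this sketch, the claim 8/claim 9 difference reduces to the `inquire` flag: claim 9 inserts a permission inquiry before the same merge development control.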
Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Sato; Katsuhiko (US Pub. No. 20200307600 A1) teaches handing over control of the vehicle from autonomous control to manual control during lane changing operations when autonomous operations are not feasible. Hashimoto et al. (US Pub. No. 20220001888 A1) teaches determining appropriate locations to hand over control of the vehicle from autonomous control to manual control.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Aaron K McCullers whose telephone number is (571)272-3523. The examiner can normally be reached Monday - Friday, roughly 9 AM - 6 PM ET. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Angela Ortiz, can be reached at (571) 272-1206. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/A.K.M./Examiner, Art Unit 3663 /ANGELA Y ORTIZ/Supervisory Patent Examiner, Art Unit 3663

Prosecution Timeline

Apr 10, 2024
Application Filed
Dec 13, 2025
Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12576724
ELECTRIC POWER EQUIPMENT
2y 5m to grant Granted Mar 17, 2026
Patent 12517508
INFORMATION TERMINAL, CONTROL SYSTEM, AND CONTROL METHOD
2y 5m to grant Granted Jan 06, 2026
Patent 12503252
METHOD FOR AUTONOMOUS MISSION PLANNING OF CARBON SATELLITE
2y 5m to grant Granted Dec 23, 2025
Patent 12497077
INTERPRETABLE KALMAN FILTER COMPRISING NEURAL NETWORK COMPONENT(S) FOR AUTONOMOUS VEHICLES
2y 5m to grant Granted Dec 16, 2025
Patent 12454444
WORK MACHINE AND METHOD FOR CONTROLLING WORK MACHINE
2y 5m to grant Granted Oct 28, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
44%
Grant Probability
77%
With Interview (+32.8%)
3y 5m
Median Time to Grant
Low
PTA Risk
Based on 72 resolved cases by this examiner. Grant probability derived from career allow rate.
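The headline projection figures are simple arithmetic on the examiner's career data shown above (32 granted of 72 resolved, +32.8% interview lift). A minimal sketch, assuming the lift is applied additively to the career allow rate and values are rounded to whole percentages (the tool's exact model is not disclosed):

```python
# Sketch of the projection arithmetic; the additive-lift assumption
# and rounding convention are ours, not documented by the tool.
granted, resolved = 32, 72
interview_lift = 0.328

base_rate = granted / resolved                      # career allow rate
grant_pct = round(base_rate * 100)                  # -> 44
with_interview_pct = round((base_rate + interview_lift) * 100)  # -> 77

print(grant_pct, with_interview_pct)
```

This reproduces the 44% grant probability and the 77% with-interview figure shown in the projections panel.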
