Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
This action is in reply to the response filed on December 9, 2025.
Claims 1 and 4 are currently pending and have been examined.
Claims 2-3 have been canceled by the applicant.
This action is made FINAL.
The examiner would like to note that this application is being handled by examiner Christine Huynh.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on December 3, 2025 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement has been considered by the examiner.
Response to Amendment
The amendment filed December 9, 2025 has been entered. Claims 1 and 4 remain pending in the application. Applicant’s amendments to the claims have overcome the 35 U.S.C. 112(b) and 35 U.S.C. 101 rejections set forth in the Non-Final Office Action mailed October 2, 2025.
Response to Arguments
Applicant’s arguments with respect to claim 1, regarding the amended limitation “a candidate vehicle detection unit configured to, based on the determination unit determining that there is at least one of a branch point and an intersection in front of the host vehicle, detect whether there are a plurality of control target candidate vehicles which are vehicles within a preset detection area of the external sensor” and the order of operations in which a plurality of control target candidate vehicles are detected after determining that there is at least one of a branch point and an intersection, have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. However, upon further search and consideration, amended claim 1 is rejected under 35 U.S.C. 103 as being unpatentable over Lee (US 20170057504 A1), Namba (US 20220242408 A1), and Oyama (US 20200310427 A1). See detailed rejection below.
Applicant's arguments filed December 9, 2025 have been fully considered but they are not persuasive. Regarding the amended limitation “a viewing direction recognition unit configured to, based on the candidate vehicle detection unit detecting that there are a plurality of control target candidate vehicles, recognize a viewing direction of a driver of the host vehicle based on a captured image of a driver monitor camera of the host vehicle, and determine a viewing area of the driver based on the viewing direction of the driver”, the applicant argues that Lee (US 20170057504 A1) in view of Namba (US 20220242408 A1) does not teach this limitation because Lee does not teach detecting that there are a plurality of control target candidate vehicles before determining a viewing direction of the driver (pages 13-14). However, the examiner respectfully disagrees, as Lee teaches (“That is, as described in FIGS. 10 and 11, when the vehicle M of the driver enters the lane on which the other vehicle B is located, the control unit 300 may compare the information about the position of the other vehicle B and the information about the visual line P3 of the driver. At this time, when an angle of the other vehicle B detected by the preceding-vehicle detection unit 200 is matched with an angle of the visual line P3 of the driver which is detected by the driver visual line detection unit 100, the control unit 300 may choose the other vehicle B as the target vehicle. When the other vehicle B is chosen as the target vehicle, the control unit 300 may control the traveling speed of the vehicle M of the driver in order to control an inter-vehicle distance between the vehicle M of the driver and the target vehicle B.” [0151-0152]), where multiple vehicles, vehicles A, B, and C, are present before a viewing direction of the driver is determined and a target vehicle is chosen to follow based on the viewing direction.
While the viewing direction in Lee can also be determined when only one vehicle is detected, the viewing direction can still be determined, especially when a plurality of vehicles are detected, in order to choose one vehicle as a target vehicle. Thus, it would have been obvious to a person of ordinary skill in the art to determine the viewing direction of the driver when a plurality of vehicles are detected in an attempt to provide an improved system or method, as a person with ordinary skill has good reason to pursue the known options within his or her technical grasp. In turn, because the product as claimed has the properties predicted by the prior art, it would have been obvious to make the system or product where the viewing direction of the driver is determined when a plurality of vehicles are detected. Therefore, claim 1 is rejected under 35 U.S.C. 103 as being unpatentable over Lee (US 20170057504 A1), Namba (US 20220242408 A1), and Oyama (US 20200310427 A1). See detailed rejection below.
Dependent claim 4 is rejected for the same reasons above due to dependency. See detailed rejection below.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1 and 4 are rejected under 35 U.S.C. 103 as being unpatentable over Lee (US 20170057504 A1) in view of Namba (US 20220242408 A1) and Oyama (US 20200310427 A1).
Regarding claims 1 and 4:
With respect to claim 1, Lee teaches:
a viewing direction recognition unit configured to, based on the candidate vehicle detection unit detecting that there are a plurality of control target candidate vehicles, recognize a viewing direction of a driver of the host vehicle based on a captured image of a driver monitor camera of the host vehicle, and determine a viewing area of the driver based on the viewing direction of the driver; (“That is, as described in FIGS. 10 and 11, when the vehicle M of the driver enters the lane on which the other vehicle B is located, the control unit 300 may compare the information about the position of the other vehicle B and the information about the visual line P3 of the driver. At this time, when an angle of the other vehicle B detected by the preceding-vehicle detection unit 200 is matched with an angle of the visual line P3 of the driver which is detected by the driver visual line detection unit 100, the control unit 300 may choose the other vehicle B as the target vehicle. When the other vehicle B is chosen as the target vehicle, the control unit 300 may control the traveling speed of the vehicle M of the driver in order to control an inter-vehicle distance between the vehicle M of the driver and the target vehicle B.” [0151-0152]), where multiple vehicles, vehicles A, B, and C, are present before a viewing direction of the driver is determined and a target vehicle is chosen to follow based on the viewing direction. While the viewing direction in Lee can also be determined when only one vehicle is detected, the viewing direction can still be determined, especially when a plurality of vehicles are detected, in order to choose one vehicle as a target vehicle.
Thus, it would have been obvious to a person of ordinary skill in the art to determine the viewing direction of the driver when a plurality of vehicles are detected in an attempt to provide an improved system or method, as a person with ordinary skill has good reason to pursue the known options within his or her technical grasp. In turn, because the product as claimed has the properties predicted by the prior art, it would have been obvious to make the system or product where the viewing direction of the driver is determined when a plurality of vehicles are detected.
Lee does not teach, but Namba teaches:
a determination unit configured to determine whether there is a branch point and an intersection in front of the host vehicle, (“The intersection detector is configured to detect, on the basis of a traveling environment in a predetermined range of a target traveling course in front of the vehicle, a set of intersections whose interval is shorter than a predetermined interval within the predetermined range of the target traveling course in front of the vehicle.” [0006], “The circuitry is configured to detect, on the basis of a traveling environment in a predetermined range of a target traveling course in front of the vehicle, a set of intersections whose interval is shorter than a predetermined interval within the predetermined range of the target traveling course in front of the vehicle.” [0007]), which teaches determining that an intersection is in front of the host vehicle.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to have combined Lee’s candidate vehicle detection unit with Namba’s intersection determination because (“The front-traveling-environment recognition section 21d may read the traveling-environment image data subjected to the image processing by the IPU 21c to recognize a front traveling environment (i.e., front-traveling-environment data) on the basis of the traveling-environment image data.” See Namba [0029]), in order to detect the surrounding environment of the host vehicle.
However, Lee and Namba do not teach, but Oyama teaches:
a candidate vehicle detection unit configured to, based on the determination unit determining that there is at least one of a branch point and an intersection in front of the host vehicle, detect whether there are a plurality of control target candidate vehicles which are vehicles within a preset detection area of the external sensor; (“The first and second sensors 22c and 22d are disposed on both sides of the front side of the bumper, so as to detect the surrounding environment information in the front direction and the right and left directions of a predetermined angular range of the own vehicle M.” [0038], “When the vehicle speed of the own vehicle Min the first stop area Sa becomes zero, other vehicles to enter the “4-WAY STOP” intersection are recognized (step S3). In this processing, other vehicles are recognized based on the image information transmitted from the IPU 23 to the front driving environment recognizer 24 and also based on the surrounding environment information transmitted from the first and second sensors 22c and 22d to the surrounding driving environment recognizer 26. At this time, the number of zero to three of other vehicles entered into the intersection and are currently stopping at the vehicle speed zero in the second to fourth stop areas Sb, Sc, and Sd is specified.” [0051], and FIG. 4), where S1 shows that it is determined that there is at least one of a branch point and an intersection in front of the host vehicle, and afterwards S3 detects whether other vehicles are present, in which the detection of other vehicles is comparable to detecting whether there are a plurality of control target candidate vehicles within a preset detection area of the external sensor. The detection of a plurality of vehicles is performed only after it is determined that the host vehicle is at an intersection.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to have combined Lee’s candidate vehicle detection unit with Oyama’s determination of a plurality of vehicles at an intersection because (“An aspect of the technology provides an intersection start judgment device that judges whether to start an own vehicle at an intersection at which vehicles in all directions need to temporarily stop.” See Oyama [0005]), for better sensing of the surrounding environment when the host vehicle is at a detected intersection.
Lee further teaches:
a control target vehicle determination unit configured to, based on the determined viewing area of the driver, determine a control target vehicle from among the plurality of control target candidate vehicles by selecting one of the plurality of control target candidate vehicles that is included in the determined viewing area of the driver; (“choosing a target vehicle based on information about a position of a preceding vehicle and information about a visual line of the driver,” [0010], “The control unit 300 may choose a target vehicle based on the information about the position of the other vehicle which is acquired by the preceding-vehicle detection unit 200 and the information about the visual line of the driver which is acquired by the driver visual line detection unit 100” [0089], “That is, as described in FIGS. 10 and 11, when the vehicle M of the driver enters the lane on which the other vehicle B is located, the control unit 300 may compare the information about the position of the other vehicle B and the information about the visual line P3 of the driver. At this time, when an angle of the other vehicle B detected by the preceding-vehicle detection unit 200 is matched with an angle of the visual line P3 of the driver which is detected by the driver visual line detection unit 100, the control unit 300 may choose the other vehicle B as the target vehicle. When the other vehicle B is chosen as the target vehicle, the control unit 300 may control the traveling speed of the vehicle M of the driver in order to control an inter-vehicle distance between the vehicle M of the driver and the target vehicle B.” [0151-0152]), where multiple vehicles, vehicles A, B, and C, are present before a viewing direction of the driver is determined and a target vehicle is chosen to follow based on the viewing direction of the driver.
While the viewing direction in Lee can also be determined when only one vehicle is detected, the viewing direction can still be determined, especially when a plurality of vehicles are detected, in order to choose one vehicle as a target vehicle.
a vehicle controller configured to control a speed of the host vehicle to follow the determined control target vehicle; (“a control unit configured to choose a target vehicle based on the information about the position of the at least one other vehicle which is acquired by the preceding-vehicle detection unit and the information about the visual line of the driver which is acquired by the driver visual line detection unit and control a traveling speed of the vehicle which the driver drives such that an inter-vehicle distance from the target vehicle becomes a preset distance.” [0012]).
With respect to claim 4, Lee in combination with Namba and Oyama, as shown in the rejection above, discloses the limitations of claim 1, including a vehicle control apparatus for performing vehicle speed control of a host vehicle. Lee does not teach, but Namba teaches:
wherein the candidate vehicle detection unit recognizes a plurality of lanes ahead of the branch point or the intersection when there is the branch point or the intersection in front of the host vehicle, and detects the plurality of control target candidate vehicles based on a recognition result of the plurality of lanes; where Lee teaches (“The preceding-vehicle detection unit 200 may detect at least one other vehicle located ahead of the vehicle 1 which the driver drives to acquire information about a position of the detected vehicle. The other vehicle located ahead of the vehicle 1 which the driver drives may include a vehicle located ahead on the same lane, a vehicle entering a lane on which the vehicle of the driver is located from a neighboring lane, or a vehicle deviating from the lane on which the vehicle of the driver is located. The preceding-vehicle detection unit 200 may detect these plurality of vehicles.” [0081]), where the control target candidate vehicle is detected based on detecting a plurality of lanes, and Namba teaches the detection of a control target candidate vehicle while detecting an intersection, (“The automatic driving assistance system is configured to execute an adaptive cruise control (ACC) that causes the own vehicle to travel following a preceding vehicle. When detecting a preceding vehicle traveling ahead of the own vehicle on a target traveling course, the automatic driving assistance system activates the ACC to control the speed of the own vehicle to a predetermined speed on the basis of an intervehicle distance between the own vehicle and the preceding vehicle and a relative vehicle speed between the own vehicle and the preceding vehicle, for example. In addition, on the basis of data on an environment in front of the own vehicle obtained by the sensing device, such as a vehicle-mounted camera, the ACC recognizes a color of a signal indicated by a traffic light installed at an intersection. In a case where the traffic light indicates a green signal, the ACC causes the own vehicle to travel at an ACC set vehicle speed set by the driver within a speed limit of the road.” [0004], “The camera unit 21 may acquire the traveling-environment data on an environment in front of the own vehicle M and recognize two lane lines defining left and right of a lane along which the own vehicle M is traveling, a road feature, presence of a preceding vehicle, and presence of a traffic light, for example. The camera unit 21 may also determine a road curvature of a middle of the lane between the left and right lane lines, an intervehicle distance between the own vehicle M and the preceding vehicle, and a relative speed between the own vehicle M and the preceding vehicle, for example.” [0018]), which shows an example in which a control target candidate vehicle is detected when there is an intersection in front of the host vehicle.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to have combined Lee’s candidate vehicle detection unit with Namba’s intersection detection because (“The circuitry is configured to detect, on the basis of a traveling environment in a predetermined range of a target traveling course in front of the vehicle, a set of intersections whose interval is shorter than a predetermined interval within the predetermined range of the target traveling course in front of the vehicle.” See Namba [0007]), in order to improve detection of the external environment when traveling through an intersection.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Christine N Huynh whose telephone number is (571)272-9980. The examiner can normally be reached Monday - Friday 8 am - 4 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Aniss Chad can be reached at (571)270-3832. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CHRISTINE NGUYEN HUYNH/Examiner, Art Unit 3662
/ANISS CHAD/Supervisory Patent Examiner, Art Unit 3662