DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This action is made final.
Claims 1, 4, 6, 8, 10, and 11 are pending. Claims 1, 10, and 11 are independent. Claims 2, 3, 5, 7, and 9 have been canceled.
Priority
Acknowledgement is made of Applicant’s claim for foreign priority of Korean application KR10-2020-0036896, filed 03/26/2020.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 8, 10, and 11 are rejected under 35 U.S.C. 103 as being unpatentable over Kim et al. (US 2021/0078407 A1) in view of Caldwell et al. (US 2021/0056853 A1).
Regarding claim 1, Kim teaches a method of providing a travelling guide using position information of a vehicle and traffic light information (FIG. 5 and [0092-0095], FIGS. 10-11 and [0118-0120]), which is a method performed by a computing device (electronic device 101 of FIG. 2 and [0037-0053]), the method comprising:
collecting image data obtained by photographing a space in which a vehicle is currently travelling ([0033], second half of [0040], [0043], [0072], [0081], FIG. 5, and [0092-0095]: another device 103 may be a camera, which collects image data of a space in which a vehicle is currently travelling. Note FIG. 5 is an example, and other examples may correspond to this feature, including FIG. 7 and/or FIG. 11 and their corresponding paragraphs);
determining a current travelling lane of the vehicle using the image data and position information of the vehicle (second half of [0040], [0072], [0081], FIG. 5, and [0092-0095]: a current travelling lane is determined using the image data and position information of the vehicle. Note FIG. 5 is an example, and other examples may correspond to this feature, including FIG. 7 and/or FIG. 11 and their corresponding paragraphs);
providing a travelling guide for the vehicle using the current travelling lane of the vehicle (second half of [0040], [0072], [0081], FIG. 5, and [0092-0095]: travelling guide is provided for the vehicle using the current travelling lane. Note FIG. 5 is an example, and other examples may correspond to this feature, including FIG. 7 and/or FIG. 11 and their corresponding paragraphs), wherein the providing of the travelling guide for the vehicle includes:
determining whether a lane change is required and the number of required lane changes based on the current travelling lane of the vehicle and a travelling route preset in the vehicle (FIG. 5 and [0092-0095]: a lane change is determined to be required. The number of lane changes required is determined to be 1 in this example based on the current travelling lane and the travelling route 507 preset in the vehicle); and
when it is determined that the lane change is required, providing a travelling guide for guiding to change lanes by the determined number of required lane changes (FIG. 5 and [0092-0095]: a travelling guide/configuration information 504 is provided when it is determined that the lane change is required),
wherein the providing of the travelling guide for guiding to change lanes includes, when it is determined that the lane change is required, determining a lane change time point of the vehicle based on the preset travelling route, wherein the determined lane change time point is corrected based on whether there is vehicle congestion on the preset travelling route (FIG. 5 and [0092-0095], FIG. 7 and [0103-0106], [0116]: a lane change time point corresponds to the time at “a point where a moving direction of the vehicle is required to be changed”, as stated in [0092], based on the preset travelling route 507. The determined lane change time point is corrected based on vehicle congestion caused by another vehicle 503, which causes the time point to be moved sooner so that vehicle 501 is behind vehicle 503. Additionally, see FIG. 7, in which a lane change time point is determined and corrected with respect to another vehicle 710); and
collecting signal information about one or more traffic lights located on a travelling route preset in the vehicle (FIGS. 10-11 and [0118-0119]),
wherein the providing of the travelling guide for the vehicle includes providing a travelling guide for guiding to start or stop the vehicle or guiding control of a speed of the vehicle based on at least one of the current travelling lane of the vehicle, the travelling route preset in the vehicle, and the signal information (FIGS. 10-11 and [0118-0120]: the travelling guide provided includes providing a travelling guide for guiding to start or stop a vehicle or guiding control of a speed of the vehicle based on at least one of the current travelling lane, the travelling route, and the signal information. For example, a red light from traffic light 1104 causes guidance for the vehicle to slow down and/or stop by making content 1102 red), and
wherein the providing of the travelling guide for the vehicle includes:
providing a first travelling guide for the vehicle based on at least one of the current travelling lane of the vehicle, a travelling route preset in the vehicle, and the signal information (second half of [0040], [0072], [0081], FIG. 5, and [0092-0095]: travelling guide for changing lane is provided for the vehicle using the current travelling lane and travelling route preset; Additionally, or alternatively, see FIG. 7 and [0103-0106]: first travelling guide may be an operation for a lane change associated with text “wait” is displayed);
determining whether the vehicle performs an operation corresponding to the first travelling guide by comparing the operation corresponding to the first travelling guide with an actual operation of the vehicle (FIG. 7 and [0103-0106], [0116]: for example, the vehicle is determined to not be performing an operation of a lane change, corresponding to the first travelling guide, in comparison to an actual operation of the vehicle); and
when it is determined that the vehicle has not performed an operation corresponding to the first travelling guide while exceeding a preset time after the providing of the first travelling guide, providing a second travelling guide for guiding to control the vehicle according to the first travelling guide (FIG. 7 and [0103-0106], [0116]: when it is determined that the vehicle has not performed the lane change corresponding to the first travelling guide while exceeding a preset time given the “wait” command, a second travelling guide is provided, this time an operation for a lane change associated with the text “OK”),
wherein the determining of the current travelling lane of the vehicle includes:
determining a current position of the vehicle and a moving direction of the vehicle using the position information of the vehicle (second half of [0040], [0072], [0081], FIG. 5, and [0092-0095]: a current travelling lane is determined using a current position of the vehicle and a moving direction of the vehicle); and
analyzing a road on which the vehicle travels and surrounding geographical features included in the image data ([0033], [0040], [0043], [0072], [0081], and [0095]: surrounding geographical features included in the image data are used to determine the current travelling lane of the vehicle).
Kim does not explicitly teach using a pre-trained artificial intelligence (AI) model, wherein the pre-trained AI model is a model trained using learning data in which lanes are classified based on a relationship with a reference, and the reference is set for each of a plurality of pieces of the image data based on one of the geographical features included therein.
Caldwell teaches analyzing a road on which the vehicle travels and surrounding geographical features included in the image data using a pre-trained artificial intelligence (AI) model to determine the current travelling lane of the vehicle (FIG. 6 and [0041-0059]: individual lanes are classified as an established lane, an occupied lane, and/or an unoccupied lane; FIG. 7 and [0072-0075]: a machine learned model/AI model is pre-trained via training system 740. The one or more trained AI models output one or more classifications of lanes based on a signal indicative of a presence of a vehicle in a lane, as supported in [0074]. Thus, the pre-trained AI model is used to determine the current travelling lane of the vehicle), and
wherein the pre-trained AI model is a model trained using learning data in which lanes are classified based on a relationship with a reference, and the reference is set for each of a plurality of pieces of the image data based on one of the geographical features included therein (FIG. 6 and [0041-0059], FIG. 7 and [0072-0075]: based on a lane’s relationship with a reference, such as vehicle 600, the lanes are classified. As supported in [0058], “the perception system 722 can receive image data and can utilize one or more image processing algorithms to perform object detection, segmentation, and/or classification with respect to object(s) identified in the image data. In some examples, the perception system 722 can associate a bounding box (or otherwise an instance segmentation) with an identified object and can associate a confidence score associated with a classification of the identified object with the identified object.” The reference, in this case vehicle 600, is set for each of a plurality of pieces of the image data based on one of the geographical features, such as road lanes, included therein. As further supported in [0074], “For example, the training system 740 can receive training data indicative of classified lanes and associated signals. The training system 740 can utilize one or more machine learning algorithms to train one or more models to output one or more classifications based at least in part on a signal. As a result, machine learned models can receive a signal associated with a lane, as described above, and can output one or more classifications for a particular lane.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the analysis disclosed in Kim by incorporating the teachings of Caldwell and use a pre-trained artificial intelligence (AI) model, wherein the pre-trained AI model is a model trained using learning data in which lanes are classified based on a relationship with a reference, and the reference is set for each of a plurality of pieces of the image data based on one of the geographical features included therein. Doing so would allow the environment to be assessed more accurately and robustly. By analyzing the road using such a pre-trained AI model, the environment of the vehicle, including lanes, can be efficiently classified as demonstrated in Caldwell. In the context of Kim’s teachings, such efficient lane classification would allow for more relevant and/or safer travelling guides to be provided to the user in the vehicle traversing a multilane environment. Moreover, classifying lanes based on a relationship with a reference may increase classification accuracy. By accounting for a reference as a variable in relation to the lanes, classification errors can be reduced.
Regarding claim 8, Kim in view of Caldwell teaches the method of claim 1. Kim further teaches wherein the collecting of the signal information about the one or more traffic lights includes:
collecting position information about the one or more traffic lights located on the preset travelling route from precision map data including lanes, lane line information, and traffic light information for each of a plurality of roads ([0033], second half of [0040], [0043], [0072], [0081], FIGS. 10-11 and [0118-0120]: precision map data includes lanes, lane line information and traffic light information for each of a plurality of roads);
collecting the image data every preset unit time from a point in time at which a distance between the vehicle and the one or more traffic lights is less than or equal to a reference distance using the position information about the one or more traffic lights and a current position of the vehicle (FIGS. 10-11 and [0118-0120]: As stated in [0118], “Referring to FIG. 11, based on identifying that a traffic light 1104 is located within a designated distance, the processor 220 may determine a distance from the traffic light 1104 (or a color (e.g., red, yellow, or green) of a signal indicated by the traffic light 1104).” The distance is within a reference distance between the vehicle and the one or more traffic lights; See, for camera details, [0033], second half of [0040], [0043], [0072], [0081]. As stated in [0072], “In some other embodiments, the sensor value may indicate traffic information such as information on a traffic signal or a sign. The traffic information may include, for example, signal information of a traffic light, location information of the traffic light, location information of a stop line, information on the existence and nonexistence of the shoulder of a road, location information of the shoulder of a road, information on the existence and nonexistence of a crosswalk, location information of the crosswalk, information on the existence and nonexistence of a school zone, and location information of the school zone.”); and
analyzing images for the one or more traffic lights included in the image data to collect the signal information about the one or more traffic lights ([0033], [0072], and FIGS. 10-11 and [0118-0120]: images for one or more traffic lights are analyzed to collect the signal information about the one or more traffic lights).
Regarding claim 10, the claim recites an apparatus comprising a memory and a processor (Kim, processor 220 and memory 230 of FIG. 2 and [0037-0053]), wherein:
the memory stores one or more instructions; and
the processor executes the one or more instructions stored in the memory to perform the method of claim 1, and the claim is therefore rejected on the same premise.
Regarding claim 11, the claim recites a non-transitory recording medium readable by a computing device, which is combined with a computing device and on which a computer program for performing the method of claim 1 is stored (Kim, [0008], [0056-0057], and [0135]; FIG. 2 and [0037-0053]), and is therefore rejected on the same premise.
Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Kim et al. (US 2021/0078407 A1) in view of Caldwell et al. (US 2021/0056853 A1), and further in view of Ramirez et al. (US 2016/0171521 A1).
Regarding claim 4, Kim in view of Caldwell teaches the method of claim 1. Kim further teaches wherein the providing of the travelling guide for the vehicle includes, when it is determined that the lane change is required, determining a lane change time point of the vehicle based on the preset travelling route (FIG. 5 and [0092-0095], FIG. 7 and [0103-0106], [0116]: a lane change time point corresponds to the time at “a point where a moving direction of the vehicle is required to be changed”, as stated in [0092], based on the preset travelling route 507. The determined lane change time point is corrected based on vehicle congestion caused by another vehicle 503, which causes the time point to be moved sooner so that vehicle 501 is behind vehicle 503. Additionally, see FIG. 7, in which a lane change time point is determined and corrected with respect to another vehicle 710).
Kim in view of Caldwell does not explicitly teach when the lane change is not performed at the determined lane change time point, changing the preset travelling route based on the current travelling lane of the vehicle.
Ramirez teaches when the lane change is not performed at the determined lane change time point, changing the preset travelling route based on the current travelling lane of the vehicle ([0182]: “This recalculation may be performed when a vehicle has been determined to have stopped, when the vehicle 1410 has modified the original route, such as by missing a turn, a user has entered a new destination and/or a new start location.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kim in view of Caldwell by incorporating the teachings of Ramirez and have when the lane change is not performed at the determined lane change time point, changing the preset travelling route based on the current travelling lane of the vehicle. Doing so would offer greater flexibility, robustness, and convenience for the user as, if the user is too late in making a lane change or turn, the user may safely be rerouted. In this way, the user is not lost even if the user deviated from an original/preset travelling route. This would also preclude the user from attempting a dangerous maneuver to make the lane change or turn as the user can rely on the preset travelling route being updated.
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Kim et al. (US 2021/0078407 A1) in view of Caldwell et al. (US 2021/0056853 A1), and further in view of Raghu et al. (US 2016/0176358 A1).
Regarding claim 6, Kim in view of Caldwell teaches the method of claim 1. Kim in view of Caldwell does not explicitly teach wherein the providing of the travelling guide for the vehicle includes:
when the current travelling lane of the vehicle is a left turn-only lane, providing a travelling guide generated by considering only a left turn signal in the signal information;
when the current travelling lane of the vehicle is a shared straight/left turn lane, providing a travelling guide generated by considering only a straight signal in the signal information or a travelling guide generated by considering only the left turn signal in the signal information according to the preset travelling route;
when the current travelling lane of the vehicle is a right turn-only lane, providing a travelling guide generated by considering only the preset route regardless of the signal information; and
when the current travelling lane of the vehicle is a shared straight/right turn lane, providing a travelling guide generated by considering only the straight signal according to the preset travelling route or a travelling guide generated by considering only the preset route regardless of the signal information.
Raghu teaches when the current travelling lane of the vehicle is a left turn-only lane, providing a travelling guide generated by considering only a left turn signal in the signal information ([0038], [0113], FIGS. 12A-B and [0131-0132], FIG. 3B and [0093], and FIGS. 5A-C and [0097-0100]: when the current lane is a left turn-only lane, a travelling guide is generated by considering only a left turn signal);
when the current travelling lane of the vehicle is a shared straight/left turn lane, providing a travelling guide generated by considering only a straight signal in the signal information or a travelling guide generated by considering only the left turn signal in the signal information according to the preset travelling route ([0038], [0113], FIGS. 12A-B and [0131-0132], FIG. 3B and [0093], and FIGS. 5A-C and [0097-0100]: when the current lane is a shared straight/left turn lane, a travelling guide is generated by considering only a straight signal or only the left signal);
when the current travelling lane of the vehicle is a right turn-only lane, providing a travelling guide generated by considering only the preset route regardless of the signal information ([0038], [0113], FIGS. 12A-B and [0131-0132], FIG. 3A and [0079-0092], and FIGS. 5A-C and [0097-0100]: when the current lane is a right turn-only lane, a travelling guide is generated by considering only a right turn signal); and
when the current travelling lane of the vehicle is a shared straight/right turn lane, providing a travelling guide generated by considering only the straight signal according to the preset travelling route or a travelling guide generated by considering only the preset route regardless of the signal information ([0038], [0113], FIGS. 12A-B and [0131-0132], FIG. 3A and [0079-0092], and FIGS. 5A-C and [0097-0100]: when the current lane is a shared straight/right turn lane, a travelling guide is generated by considering only a straight signal or a right turn signal).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kim in view of Caldwell by incorporating the teachings of Raghu and have wherein the providing of the travelling guide for the vehicle includes: when the current travelling lane of the vehicle is a left turn-only lane, providing a travelling guide generated by considering only a left turn signal in the signal information; when the current travelling lane of the vehicle is a shared straight/left turn lane, providing a travelling guide generated by considering only a straight signal in the signal information or a travelling guide generated by considering only the left turn signal in the signal information according to the preset travelling route; when the current travelling lane of the vehicle is a right turn-only lane, providing a travelling guide generated by considering only the preset route regardless of the signal information; and when the current travelling lane of the vehicle is a shared straight/right turn lane, providing a travelling guide generated by considering only the straight signal according to the preset travelling route or a travelling guide generated by considering only the preset route regardless of the signal information. Doing so would help the user stay on course along their intended route and prevent accidents, as the user is guided with navigation information with respect to relevant signal information. In this way, the user is less likely to take a wrong direction, which could cause less efficient travel and endanger the user’s safety.
Response to Arguments
Applicant’s arguments with respect to the claims have been considered but are moot because the new ground of rejection does not rely on the same anticipatory reference or combination of references applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Conclusion
The prior art made of record and not relied upon is considered pertinent to Applicant’s disclosure, including:
US 11480962 B1: lane classification and using features of objects to characterize the drivable region, including distance between an object and a proximate driving lane
US 2020/0249685 A1: training a machine learning model to predict trajectory of a vehicle lane
US 2020/0192365 A1: determining occluded lane segments based on relationship between predicted location of an object and the vehicle’s location
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to KENNY NGUYEN whose telephone number is (571)272-4980. The examiner can normally be reached M-Th 7AM to 5PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, KIEU D VU can be reached on (571)272-4057. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/KENNY NGUYEN/Primary Examiner, Art Unit 2171