DETAILED CORRESPONDENCE
This final office action is in response to the Amendments filed on 05 February 2026, regarding application number 18/813,836.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Response to Amendment
Claims 1, 4, 6-12, 15 and 17-20 remain pending in the application, while claims 2-3, 5, 13-14 and 16 have been cancelled.
Applicant’s amendments to the Claims have overcome each and every 35 U.S.C. 112(b) rejection previously set forth in the non-final office action mailed 05 November 2025. Therefore, the rejections have been withdrawn.
Response to Arguments
Applicant’s arguments, see Pages 9-11, filed 05 February 2026, with respect to the rejections of claims 1, 4, 6-12, 15 and 17-20 under 35 U.S.C. § 101 have been fully considered but they are not persuasive.
Applicant has argued the following with respect to Prong One of Step 2A:
“First, amended claim 1, the independent method claim (and similarly independent claims 11 and 12), recites that the claimed operations are performed by a processor of an autonomous vehicle. The recited operations therefore require execution by a hardware component, i.e., the processor, and cannot practically be formed in the human mind. Therefore, claim 1 (and also independent claims 11 and 12) is not directed to a mental process.”
Examiner respectfully disagrees. The “processor” is recited at a high level of generality and is merely a generic computer component being used as a tool to perform the abstract idea. The mere nominal recitation of the processor does not take the claim limitations out of the mental process grouping. Thus, the claim recites a mental process. See MPEP 2106.04(d)(I).
“Further, amended claim 1 (and similarly independent claims 11 and 12) additionally recites "displaying, by the processor, a result of the risk level through a display unit equipped in the autonomous vehicle or outputting a warning sound or a message." (Emphasis added.) This step does not fall within mental processes. Rather, this step is implemented by the hardware configuration that displays a result of the risk level or outputs a warning sound or a message, i.e., by the processor and through a display unit equipped in the autonomous vehicle, and/or an output device configured to output a warning sound or a message.”
Examiner respectfully disagrees because the “displaying” step is recited at a high level of generality and amounts to mere data displaying, which is a form of insignificant extra-solution activity. See full analysis below.
Applicant has argued the following with respect to Prong Two of Step 2A:
“…Independent claims 1, 11, and 12 reflect how to achieve the technical improvement mentioned above in the functioning of controlling an autonomous vehicle. Specifically, the subject matter of independent claim 1 involves (and similarly independent claims 11 and 12) analyzing a driving pattern of the at least one target vehicle based on a driving status criterion comprising a first driving status criterion with respect to the longitudinal behavior and a second driving status criterion with respect to the lateral behavior, determining a risk level with respect to the at least one target vehicle based on a result of the analyzing, and displaying a result of the risk level through a display unit equipped in the autonomous vehicle or outputting a warning sound or a message, thereby enabling stable defensive driving, improving vehicle stability, and providing a caution or warning that reduces a risk of an accident...”
Examiner respectfully disagrees. The “displaying” step is recited at a high level of generality (i.e., as a general means of displaying the risk level resulting from the mental analyzing step) and amounts to mere data displaying, which is a form of insignificant extra-solution activity. The claim does not recite any particulars of what the displaying includes or how the displaying is executed. Additionally, the claim requires either displaying the result of the risk level or “outputting a warning sound or a message”. The “outputting” step is likewise recited at a high level of generality (i.e., as a general means of post-solution data outputting) and amounts to insignificant extra-solution activity. The claim also does not necessarily link the outputted warning sound or message to the result of the risk level. See MPEP 2106.05(g).
Applicant additionally argued “Specifically, the subject matter of independent claims 1, 11, and 12 may determine a risk level by analyzing a driving pattern of a nearby vehicle, thereby stably performing defensive driving and improving driving stability,” but these features are not yet recited in the claims. Examiner recommends claiming an explicit control step of performing defensive driving and/or improving driving stability.
Applicant has argued the following with respect to Step 2B:
“In claims 1, 11, and 12, the above additional elements amount to significantly more than the judicial exception itself. As discussed above, the limitations of claims 1, 11, and 12 show a technical improvement in existing technology….”
Examiner respectfully disagrees for at least the same reasons discussed above. Under the 2019 PEG, a conclusion that an additional element is insignificant extra-solution activity in Step 2A should be re-evaluated in Step 2B. Here, the displaying and outputting steps are considered to be insignificant extra-solution activity in Step 2A, and thus they are re-evaluated in Step 2B to determine if they are more than what is well-understood, routine, conventional activity in the field. The background recites that the processor is a conventional processor and does not provide any indication that it could be anything other than conventional computer equipment. MPEP 2106.05(d)(II), and the cases cited therein, including Intellectual Ventures I, LLC v. Symantec Corp., 838 F.3d 1307, 1321 (Fed. Cir. 2016), TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610 (Fed. Cir. 2016), and OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363 (Fed. Cir. 2015), indicate that mere collection or receipt of data over a network is a well-understood, routine, and conventional function when it is claimed in a merely generic manner (as it is here). Further, the Federal Circuit in Trading Techs. Int’l v. IBG LLC, 921 F.3d 1084, 1093 (Fed. Cir. 2019), and Intellectual Ventures I LLC v. Erie Indemnity Co., 850 F.3d 1315, 1331 (Fed. Cir. 2017), for example, indicated that the mere displaying of data is a well-understood, routine, and conventional function.
For at least the reasons discussed above and below, the claim rejections under 35 U.S.C. § 101 have been maintained.
Applicant’s arguments, see Pages 11-14, with respect to the rejections of the claims under 35 U.S.C. § 102 and 35 U.S.C. § 103 have been fully considered and are persuasive. Therefore, the rejections have been withdrawn. However, upon further consideration, a new ground(s) of rejection is made further in view of newly cited references Heilbron et al. (US 20230236037 A1) and Okuda (US 20150070159 A1). See full details below.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1, 4, 6-12, 15 and 17-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to mental processes without significantly more.
Regarding Claim 1
Claim 1 recites a method of controlling an autonomous vehicle comprising a processor, the method comprising:
recognizing, by the processor, at least one vehicle driving around the autonomous vehicle within a preset reference range;
setting, by the processor, at least one target vehicle by assigning an identifier (ID) to at least one among the at least one vehicle;
analyzing, by the processor, a driving pattern of the at least one target vehicle, based on a driving status criterion, the driving status criterion being set based on whether a directional behavior of the at least one target vehicle is a longitudinal behavior or a lateral behavior;
determining, by the processor, a risk level with respect to the at least one target vehicle based on a result of the analyzing; and
displaying, by the processor, a result of the risk level through a display unit equipped in the autonomous vehicle or outputting a warning sound or a message,
wherein the driving status criterion comprises a first driving status criterion with respect to the longitudinal behavior and a second driving status criterion with respect to the lateral behavior, and
wherein the second driving status criterion includes criteria related to:
when the target vehicle is located at a center of a lane but continues steering;
when the target vehicle turns by steering greater than a curvature of a straight road or a curved road on which the target vehicle is driving; and
when the target vehicle changes steering from steering less than the curvature of the curved road on which the target vehicle is driving to steering greater than the curvature of the curved road.
Claim analysis via 2019 PEG
Step 1: Statutory Category – Yes
The claim recites a method including at least one step. The claim falls within one of the four statutory categories because the claim is to a process. See MPEP 2106.03.
Step 2A Prong One Evaluation: Judicial Exception – Yes – Mental processes
Claims are to be analyzed to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) mental processes, and/or c) certain methods of organizing human activity.
The Office submits that the foregoing bolded limitation(s) constitutes judicial exceptions in terms of “mental processes” because under its broadest reasonable interpretation, the claim covers performance of the limitation in the human mind.
The claim recites the limitations of “recognizing at least one vehicle…”, “setting at least one target vehicle…”, “analyzing a driving pattern…”, “determining a risk level…” and “wherein the driving status criterion comprises…”. These limitations, as drafted, are a process that, under its broadest reasonable interpretation, covers performance of the limitations in the mind but for the recitation of the generic computer component “processor”. That is, other than reciting the “processor”, nothing in the claim elements precludes the steps from practically being performed in the mind. For example, but for the generic computer language, the claim encompasses a driver recognizing a vehicle around them, assigning an ID to the vehicle, analyzing the driving performance of the vehicle, and determining whether or not the vehicle poses a risk to the driver. The driver can analyze the vehicle around them by determining whether the vehicle swerves around the center of a lane and whether the vehicle steers less than or greater than the curvature of a curved road. The mere nominal recitation of the processor does not take the claim limitations out of the mental process grouping. Thus, the claim recites a mental process.
Accordingly, the claim is directed to an abstract idea.
Step 2A Prong Two Evaluation: Practical Application - No
The claims are evaluated to determine whether, as a whole, they integrate the recited judicial exception into a practical application. As noted in the 2019 PEG, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements that merely use a computer to implement an abstract idea, add insignificant extra-solution activity, or generally link use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application.”
In the present case, the additional limitations beyond the above-noted abstract idea are as follows (where the underlined portions are the “additional limitations” while the bolded portions continue to represent the “abstract idea”).
The claim recites the additional element “…by the processor…”. The “processor” does not integrate the abstract idea into a practical application because it is described at a high level of generality and is merely a computer component being used as a tool to perform the abstract idea. See MPEP 2106.04(d)(I).
The claim recites the additional limitation “displaying … a result of the risk level through a display unit equipped in the autonomous vehicle or outputting a warning sound or a message”. The “displaying” step is recited at a high level of generality (i.e., as a general means of displaying the risk level resulting from the mental analyzing step) and amounts to mere data displaying, which is a form of insignificant extra-solution activity. The claim does not recite any particulars of what the displaying includes or how the displaying is executed. Additionally, the claim requires either displaying the result of the risk level or “outputting a warning sound or a message”. The “outputting” step is likewise recited at a high level of generality (i.e., as a general means of post-solution data outputting) and amounts to insignificant extra-solution activity. The claim also does not necessarily link the outputted warning sound or message to the result of the risk level. See MPEP 2106.05(g).
Accordingly, even in combination, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
Accordingly, the claim is directed to an abstract idea.
Step 2B Evaluation: Inventive concept - No
The claim is evaluated to determine whether, as a whole, it amounts to significantly more than the recited exception, i.e., whether any additional element, or combination of additional elements, adds an inventive concept to the claim.
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed with respect to Step 2A Prong Two, for the additional element in the claim in which the “processor” is merely a tool being used to perform the abstract idea, the same analysis applies here as above. Merely using a computer as a tool to perform an abstract idea cannot integrate a judicial exception into a practical application or provide an inventive concept.
Under the 2019 PEG, a conclusion that an additional element is insignificant extra-solution activity in Step 2A should be re-evaluated in Step 2B. Here, the displaying and outputting steps are considered to be insignificant extra-solution activity in Step 2A, and thus they are re-evaluated in Step 2B to determine if they are more than what is well-understood, routine, conventional activity in the field.
The background recites that the processor is a conventional processor and does not provide any indication that it could be anything other than conventional computer equipment. MPEP 2106.05(d)(II), and the cases cited therein, including Intellectual Ventures I, LLC v. Symantec Corp., 838 F.3d 1307, 1321 (Fed. Cir. 2016), TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610 (Fed. Cir. 2016), and OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363 (Fed. Cir. 2015), indicate that mere collection or receipt of data over a network is a well-understood, routine, and conventional function when it is claimed in a merely generic manner (as it is here). Further, the Federal Circuit in Trading Techs. Int’l v. IBG LLC, 921 F.3d 1084, 1093 (Fed. Cir. 2019), and Intellectual Ventures I LLC v. Erie Indemnity Co., 850 F.3d 1315, 1331 (Fed. Cir. 2017), for example, indicated that the mere displaying of data is a well-understood, routine, and conventional function.
Claim 1 is not patent eligible.
Regarding Claims 4 and 6-10
Claim 4 recites the method of claim 1,
wherein the first driving status criterion is set based on an absolute value change in at least one of a longitudinal position, speed, or acceleration of the at least one target vehicle and a relative value change in at least one of the longitudinal position, speed, or acceleration with respect to a vehicle in front of the at least one target vehicle.
Claim 6 recites the method of claim 1,
further comprising:
in response to a departure of the directional behavior of the at least one target vehicle from the first driving status criterion or the second driving status criterion, calculating, by the processor, a risk score;
accumulating, by the processor, the calculated risk score; and
determining, by the processor, the risk level based on the accumulated risk score and a plurality of preset reference levels.
Claim 7 recites the method of claim 6,
further comprising:
applying, by the processor, a weight to the risk score calculated correspondingly to the departure from the second driving status criterion.
Claim 8 recites the method of claim 6,
further comprising:
in response to determination that the risk level is a danger level among the plurality of preset reference levels, recognizing, by the processor, a license plate of the at least one target vehicle and storing the recognized license plate; and
in response to re-detection of the at least one target vehicle after departing from the preset reference range, setting, by the processor, a previous danger level as a current danger level for the re-detected target vehicle.
Claim 9 recites the method of claim 6,
further comprising:
in response to a departure of the at least one target vehicle from a preset safety speed or a preset safety distance, setting, by the processor, the at least one target vehicle as a danger level.
Claim 10 recites the method of claim 1,
wherein:
the recognizing of the at least one vehicle includes recognizing a plurality of vehicles driving around the autonomous vehicle within a preset reference range;
the setting of the at least one target vehicle includes setting a plurality of target vehicles by assigning an ID to each of the plurality of vehicles;
the analyzing of the driving pattern includes analyzing a driving pattern of each of the plurality of target vehicles; and
the determining of the risk level includes determining a risk level with respect to each of the plurality of target vehicles based on a result of the analyzing of the corresponding driving pattern.
Claim analysis via 2019 PEG
Step 1: Statutory category – Yes
The claims recite a method including at least one step. The claims fall within one of the four statutory categories because the claims are to a process. See MPEP 2106.03.
Step 2A Prong One Evaluation: Judicial Exception – Yes – Mental processes
Claims are to be analyzed to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) mental processes, and/or c) certain methods of organizing human activity.
The Office submits that the foregoing bolded limitation(s) constitutes judicial exceptions in terms of “mental processes” because under its broadest reasonable interpretation, the claims cover performance of the limitation in the human mind.
Regarding claim 4, the claim recites the limitation of “wherein the first driving status criterion is set based on an absolute value change in at least one of a longitudinal position, speed, or acceleration…”. This limitation, as drafted, is a process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind. That is, nothing in the claim elements precludes the step from practically being performed in the mind. For example, the claim encompasses the driver performing the above mental process, by further analyzing the driving pattern based on a change in position, speed or acceleration of the vehicle. Thus, the claim recites a mental process.
Regarding claim 6, the claim recites the limitations of “…calculating a risk score”, “accumulating the calculated risk score” and “determining the risk level…”. These limitations, as drafted, are a process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of the generic computer component “processor”. That is, other than reciting the “processor”, nothing in the claim elements precludes the step from practically being performed in the mind. For example, but for the generic computer language, the claim encompasses the driver performing the above mental process, by further calculating a risk score and risk level based on how much the vehicle is deviating from an expected longitudinal or lateral position. Thus, the claim recites a mental process.
Regarding claim 7, the claim recites the limitation of “applying a weight to the risk score”. This limitation, as drafted, is a process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of the generic computer component “processor”. That is, other than reciting the “processor”, nothing in the claim elements precludes the step from practically being performed in the mind. For example, but for the generic computer language, the claim encompasses the driver performing the above mental process, by further applying a weight to the risk score calculated correspondingly to the departure from the second driving status criterion. Thus, the claim recites a mental process.
Regarding claim 8, the claim recites the limitations of “recognizing a license plate…” and “…re-detection of the at least one target vehicle…”. These limitations, as drafted, are a process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of the generic computer component “processor”. That is, other than reciting the “processor”, nothing in the claim elements precludes the step from practically being performed in the mind. For example, but for the generic computer language, the claim encompasses the driver performing the above mental process, by further recognizing a license plate of the vehicle and seeing the vehicle again at a later time. Thus, the claim recites a mental process.
Regarding claim 9, the claim recites the limitation of “…setting the at least one target vehicle as a danger level”. This limitation, as drafted, is a process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of the generic computer component “processor”. That is, other than reciting the “processor”, nothing in the claim elements precludes the step from practically being performed in the mind. For example, but for the generic computer language, the claim encompasses the driver performing the above mental process, by further setting the vehicle as a danger level when it departs from a preset safety speed or distance. Thus, the claim recites a mental process.
Regarding claim 10, the claim recites the limitations of “recognizing…”, “setting…”, “analyzing…” and “determining…”. These limitations, as drafted, are a process that, under its broadest reasonable interpretation, covers performance of the limitations in the mind. That is, nothing in the claim elements precludes the steps from practically being performed in the mind. For example, the claim encompasses the driver performing the above mental process with a plurality of vehicles. Thus, the claim recites a mental process.
Accordingly, the claims are directed to an abstract idea.
Step 2A Prong Two Evaluation: Practical Application - No
The claims are evaluated to determine whether, as a whole, they integrate the recited judicial exception into a practical application. As noted in the 2019 PEG, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements that merely use a computer to implement an abstract idea, add insignificant extra-solution activity, or generally link use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application.”
In the present case, the additional limitations beyond the above-noted abstract idea are as follows (where the underlined portions are the “additional limitations” while the bolded portions continue to represent the “abstract idea”).
Claims 4 and 10 do not recite any additional elements.
Claims 6-9 recite the additional element of “…by the processor…”.
Claim 8 recites the additional step of “storing the recognized license plate”.
The “processor” does not integrate the abstract idea into a practical application because it is described at a high level of generality and is merely a computer being used as a tool to perform the abstract idea. See MPEP 2106.04(d)(I).
The “storing the recognized license plate” step is recited at a high level of generality (i.e., as a general means of gathering license plate data) and amounts to mere data gathering, which is a form of insignificant extra-solution activity.
Accordingly, even in combination, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
Accordingly, the claims are directed to an abstract idea.
Step 2B Evaluation: Inventive concept - No
The claims are evaluated to determine whether, as a whole, each amounts to significantly more than the recited exception, i.e., whether any additional element, or combination of additional elements, adds an inventive concept to the claim.
The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed with respect to Step 2A Prong Two, for the additional elements in the claims in which the “processor” is merely a tool being used to perform the abstract idea, the same analysis applies here as above. Merely using a computer as a tool to perform an abstract idea cannot integrate a judicial exception into a practical application or provide an inventive concept.
Under the 2019 PEG, a conclusion that an additional element is insignificant extra-solution activity in Step 2A should be re-evaluated in Step 2B. Here, the data gathering step is considered to be insignificant extra-solution activity in Step 2A, and thus it is re-evaluated in Step 2B to determine if it is more than what is well-understood, routine, conventional activity in the field.
The background recites that the processor is a conventional processor and does not provide any indication that it could be anything other than conventional computer equipment. MPEP 2106.05(d)(II), and the cases cited therein, including Intellectual Ventures I, LLC v. Symantec Corp., 838 F.3d 1307, 1321 (Fed. Cir. 2016), TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610 (Fed. Cir. 2016), and OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363 (Fed. Cir. 2015), indicate that mere collection or receipt of data over a network is a well-understood, routine, and conventional function when it is claimed in a merely generic manner (as it is here).
Claims 4 and 6-10 are not patent eligible.
Regarding Claim 11
Claim 11 recites a non-transitory computer-readable storage medium storing instructions that, when executed by a processor, cause the processor to perform:
recognizing at least one vehicle driving around an autonomous vehicle within a preset reference range;
setting at least one target vehicle by assigning an identifier (ID) to at least one among the at least one vehicle;
analyzing a driving pattern of the at least one target vehicle based on a driving status criterion, the driving status criterion being set based on whether a directional behavior of the at least one target vehicle is a longitudinal behavior or a lateral behavior;
determining a risk level with respect to the at least one target vehicle based on a result of the analyzing; and
displaying a result of the risk level through a display unit equipped in the autonomous vehicle or outputting a warning sound or a message,
wherein the driving status criterion comprises a first driving status criterion with respect to the longitudinal behavior and a second driving status criterion with respect to the lateral behavior, and
wherein the second driving status criterion includes criteria related to:
when the target vehicle is located at a center of a lane but continues steering;
when the target vehicle turns by steering greater than a curvature of a straight road or a curved road on which the target vehicle is driving; and
when the target vehicle changes steering from steering less than the curvature of the curved road on which the target vehicle is driving to steering greater than the curvature of the curved road.
Claim analysis via 2019 PEG
Step 1: Statutory Category – Yes
The claim recites a non-transitory computer-readable storage medium. Thus, the claim falls within one of the four statutory categories because the claim is to a manufacture/machine. See MPEP 2106.03.
Step 2A Prong One Evaluation: Judicial Exception – Yes – Mental processes
Claims are to be analyzed to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) mental processes, and/or c) certain methods of organizing human activity.
The Office submits that the foregoing bolded limitation(s) constitutes judicial exceptions in terms of “mental processes” because under its broadest reasonable interpretation, the claim covers performance of the limitation in the human mind.
The claim recites the limitations of “recognizing at least one vehicle…”, “setting at least one target vehicle…”, “analyzing a driving pattern…”, “determining a risk level…” and “wherein the driving status criterion comprises…”. These limitations, as drafted, are a process that, under its broadest reasonable interpretation, covers performance of the limitations in the mind but for the recitation of the generic computer component “processor”. That is, other than reciting the “processor”, nothing in the claim elements precludes the steps from practically being performed in the mind. For example, but for the generic computer language, the claim encompasses a driver recognizing a vehicle around them, assigning an ID to the vehicle, analyzing the driving performance of the vehicle, and determining whether or not the vehicle poses a risk to the driver. The driver can analyze the vehicle around them by determining whether the vehicle swerves around the center of a lane and whether the vehicle steers less than or greater than the curvature of a curved road. The mere nominal recitation of the processor does not take the claim limitations out of the mental process grouping. Thus, the claim recites a mental process.
Accordingly, the claim is directed to an abstract idea.
Step 2A Prong Two Evaluation: Practical Application - No
The claims are evaluated to determine whether, as a whole, they integrate the recited judicial exception into a practical application. As noted in the 2019 PEG, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements that merely use a computer to implement an abstract idea, add insignificant extra-solution activity, or generally link use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application.”
In the present case, the additional limitations beyond the above-noted abstract idea are as follows (where the underlined portions are the “additional limitations” while the bolded portions continue to represent the “abstract idea”).
The claim recites the additional element “…cause the processor to perform…”. This element does not integrate the abstract idea into a practical application because the processor is described at a high level of generality and is merely a computer component being used as a tool to perform the abstract idea. See MPEP 2106.04(d)(I).
The claim recites the additional limitation “displaying a result of the risk level through a display unit equipped in the autonomous vehicle or outputting a warning sound or a message”. The “displaying” step is recited at a high level of generality (i.e., as a general means of displaying the risk level resulting from the mental analyzing step) and amounts to mere data displaying, which is a form of insignificant extra-solution activity. The claim does not recite any particulars of what the displaying includes or how the displaying is executed. Additionally, the claim requires either displaying the result of the risk level or “outputting a warning sound or a message”. The “outputting” step is likewise recited at a high level of generality (i.e., as a general means of post-solution data output) and amounts to mere data output, which is a form of insignificant extra-solution activity. The claim also does not necessarily link the outputted warning sound or message to the result of the risk level. See MPEP 2106.05(g).
Accordingly, even in combination, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
Accordingly, the claim is directed to an abstract idea.
Step 2B Evaluation: Inventive concept - No
The claim is evaluated to determine whether, as a whole, it amounts to significantly more than the recited exception, i.e., whether any additional element, or combination of additional elements, adds an inventive concept to the claim.
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed with respect to Step 2A Prong Two, the “processor” is merely a tool being used to perform the abstract idea, and the same analysis applies here. Merely using a computer as a tool to perform an abstract idea cannot integrate a judicial exception into a practical application or provide an inventive concept.
Under the 2019 PEG, a conclusion that an additional element is insignificant extra-solution activity in Step 2A should be re-evaluated in Step 2B. Here, the displaying and outputting steps were considered to be insignificant extra-solution activity in Step 2A, and thus they are re-evaluated in Step 2B to determine whether they amount to more than what is well-understood, routine, conventional activity in the field.
The background recites that the processor is a conventional processor and does not provide any indication that it could be anything other than conventional computer equipment. MPEP 2106.05(d)(II), and the cases cited therein, including Intellectual Ventures I, LLC v. Symantec Corp., 838 F.3d 1307, 1321 (Fed. Cir. 2016), TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610 (Fed. Cir. 2016), and OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363 (Fed. Cir. 2015), indicate that mere collection or receipt of data over a network is a well-understood, routine, and conventional function when it is claimed in a merely generic manner (as it is here). Further, the Federal Circuit in Trading Techs. Int’l v. IBG LLC, 921 F.3d 1084, 1093 (Fed. Cir. 2019), and Intellectual Ventures I LLC v. Erie Indemnity Co., 850 F.3d 1315, 1331 (Fed. Cir. 2017), for example, indicated that the mere displaying of data is a well-understood, routine, and conventional function.
Claim 11 is not patent eligible.
Regarding Claim 12
Claim 12 recites an autonomous vehicle comprising a processor, wherein the processor is configured to:
recognize at least one vehicle driving around the autonomous vehicle within a preset reference range;
set at least one target vehicle by assigning an identifier (ID) to at least one among the at least one vehicle;
analyze a driving pattern of the at least one target vehicle based on a driving status criterion, the driving status criterion being set based on whether a directional behavior of the at least one target vehicle is a longitudinal behavior or a lateral behavior;
determine a risk level with respect to the at least one target vehicle based on a result of the analyzing; and
display a result of the risk level through a display unit equipped in the autonomous vehicle or outputting a warning sound or a message,
wherein the driving status criterion comprises a first driving status criterion with respect to the longitudinal behavior and a second driving status criterion with respect to the lateral behavior, and
wherein the second driving status criterion includes criteria related to:
when the target vehicle is located at a center of a lane but continues steering;
when the target vehicle turns by steering greater than a curvature of a straight road or a curved road on which the target vehicle is driving; and
when the target vehicle changes steering from steering less than the curvature of the curved road on which the target vehicle is driving to steering greater than the curvature of the curved road.
Claim analysis via 2019 PEG
Step 1: Statutory Category – Yes
The claim recites an autonomous vehicle comprising a processor. Thus, the claim falls within one of the four statutory categories because the claim is to a machine. See MPEP 2106.03.
Step 2A Prong One Evaluation: Judicial Exception – Yes – Mental processes
Claims are analyzed to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) mental processes, and/or c) certain methods of organizing human activity.
The Office submits that the foregoing bolded limitation(s) constitute judicial exceptions in terms of “mental processes” because, under its broadest reasonable interpretation, the claim covers performance of the limitations in the human mind.
The claim recites the limitations of “recognize at least one vehicle…”, “set at least one target vehicle…”, “analyze a driving pattern…”, “determine a risk level…” and “wherein the driving status criterion comprises…”. These limitations, as drafted, are processes that, under their broadest reasonable interpretation, cover performance of the limitations in the mind but for the recitation of the generic computer component “processor”. That is, other than reciting the “processor”, nothing in the claim elements precludes the steps from practically being performed in the mind. For example, but for the generic computer language, the claim encompasses a driver recognizing a vehicle around them, assigning an ID to the vehicle, analyzing the driving performance of the vehicle, and determining whether or not the vehicle poses a risk to the driver. The driver can analyze the vehicle around them by determining whether the vehicle swerves around the center of a lane and whether the vehicle steers less than or greater than the curvature of a curved road. The mere nominal recitation of the processor does not take the claim limitations out of the mental process grouping. Thus, the claim recites a mental process.
Accordingly, the claim is directed to an abstract idea.
Step 2A Prong Two Evaluation: Practical Application - No
The claims are evaluated to determine whether, as a whole, they integrate the recited judicial exception into a practical application. As noted in the 2019 PEG, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements that merely use a computer to implement an abstract idea, add insignificant extra-solution activity, or generally link use of a judicial exception to a particular technological environment or field of use do not integrate the judicial exception into a “practical application.”
In the present case, the additional limitations beyond the above-noted abstract idea are as follows (where the underlined portions are the “additional limitations” while the bolded portions continue to represent the “abstract idea”).
The claim recites the additional element “…wherein the processor is configured to…”. The “processor” does not integrate the abstract idea into a practical application because it is described at a high level of generality and is merely a computer component being used as a tool to perform the abstract idea. See MPEP 2106.04(d)(I).
The claim recites the additional limitation “display a result of the risk level through a display unit equipped in the autonomous vehicle or outputting a warning sound or a message”. The “display” step is recited at a high level of generality (i.e., as a general means of displaying the risk level resulting from the mental analyzing step) and amounts to mere data displaying, which is a form of insignificant extra-solution activity. The claim does not recite any particulars of what the displaying includes or how the displaying is executed. Additionally, the claim requires either displaying the result of the risk level or “outputting a warning sound or a message”. The “outputting” step is likewise recited at a high level of generality (i.e., as a general means of post-solution data output) and amounts to mere data output, which is a form of insignificant extra-solution activity. The claim also does not necessarily link the outputted warning sound or message to the result of the risk level. See MPEP 2106.05(g).
Accordingly, even in combination, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
Step 2B Evaluation: Inventive concept - No
The claim is evaluated to determine whether, as a whole, it amounts to significantly more than the recited exception, i.e., whether any additional element, or combination of additional elements, adds an inventive concept to the claim.
The claim does not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed with respect to Step 2A Prong Two, the “processor” is merely a tool being used to perform the abstract idea, and the same analysis applies here. Merely using a computer as a tool to perform an abstract idea cannot integrate a judicial exception into a practical application or provide an inventive concept.
Under the 2019 PEG, a conclusion that an additional element is insignificant extra-solution activity in Step 2A should be re-evaluated in Step 2B. Here, the displaying and outputting steps were considered to be insignificant extra-solution activity in Step 2A, and thus they are re-evaluated in Step 2B to determine whether they amount to more than what is well-understood, routine, conventional activity in the field.
The background recites that the processor is a conventional processor and does not provide any indication that it could be anything other than conventional computer equipment. MPEP 2106.05(d)(II), and the cases cited therein, including Intellectual Ventures I, LLC v. Symantec Corp., 838 F.3d 1307, 1321 (Fed. Cir. 2016), TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610 (Fed. Cir. 2016), and OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363 (Fed. Cir. 2015), indicate that mere collection or receipt of data over a network is a well-understood, routine, and conventional function when it is claimed in a merely generic manner (as it is here). Further, the Federal Circuit in Trading Techs. Int’l v. IBG LLC, 921 F.3d 1084, 1093 (Fed. Cir. 2019), and Intellectual Ventures I LLC v. Erie Indemnity Co., 850 F.3d 1315, 1331 (Fed. Cir. 2017), for example, indicated that the mere displaying of data is a well-understood, routine, and conventional function.
Claim 12 is not patent eligible.
Regarding Claims 15 and 17-20
Claim 15 recites the autonomous vehicle of claim 12,
wherein the first driving status criterion is set based on an absolute value change in at least one of a longitudinal position, speed, or acceleration of the at least one target vehicle and a relative value change in at least one of the longitudinal position, speed, or acceleration with respect to a vehicle in front of the at least one target vehicle.
Claim 17 recites the autonomous vehicle of claim 12,
wherein the processor is further configured to:
in response to a departure of the directional behavior of the at least one target vehicle from the first driving status criterion or the second driving status criterion, calculate a risk score;
accumulate the calculated risk score; and
determine the risk level based on the accumulated risk score and a plurality of preset reference levels.
Claim 18 recites the autonomous vehicle of claim 17,
wherein the processor is further configured to:
apply a weight to the risk score calculated correspondingly to the departure from the second driving status criterion.
Claim 19 recites the autonomous vehicle of claim 17,
wherein the processor is further configured to:
in response to determination that the risk level is a danger level among the plurality of preset reference levels, recognize a license plate of the at least one target vehicle and store the recognized license plate; and
in response to re-detection of the at least one target vehicle after departing from the preset reference range, set a previous danger level as a current danger level for the re-detected target vehicle.
Claim 20 recites the autonomous vehicle of claim 17,
wherein the processor is further configured to:
in response to a departure of the at least one target vehicle from a preset safety speed or a preset safety distance, set the at least one target vehicle as a danger level.
Claim analysis via 2019 PEG
Step 1: Statutory category – Yes
The claims recite an autonomous vehicle comprising a processor. Thus, the claims fall within one of the four statutory categories because the claims are to a machine. See MPEP 2106.03.
Step 2A Prong One Evaluation: Judicial Exception – Yes – Mental processes
Claims are analyzed to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) mental processes, and/or c) certain methods of organizing human activity.
The Office submits that the foregoing bolded limitation(s) constitute judicial exceptions in terms of “mental processes” because, under their broadest reasonable interpretation, the claims cover performance of the limitations in the human mind.
Regarding claim 15, the claim recites the limitation of “wherein the first driving status criterion is set based on an absolute value change in at least one of a longitudinal position, speed, or acceleration…”. This limitation, as drafted, is a process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind. That is, nothing in the claim elements precludes the step from practically being performed in the mind. For example, the claim encompasses the driver performing the above mental process, by further analyzing the driving pattern based on a change in position, speed or acceleration of the vehicle. Thus, the claim recites a mental process.
Regarding claim 17, the claim recites the limitations of “…calculate a risk score”, “accumulate the calculated risk score” and “determine the risk level…”. These limitations, as drafted, are processes that, under their broadest reasonable interpretation, cover performance of the limitations in the mind but for the recitation of the generic computer component “processor”. That is, other than reciting the “processor”, nothing in the claim elements precludes the steps from practically being performed in the mind. For example, but for the generic computer language, the claim encompasses the driver performing the above mental process by further calculating a risk score and risk level based on how much the vehicle deviates from an expected longitudinal or lateral position. Thus, the claim recites a mental process.
Regarding claim 18, the claim recites the limitation of “apply a weight to the risk score”. This limitation, as drafted, is a process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of the generic computer component “processor”. That is, other than reciting the “processor”, nothing in the claim elements precludes the step from practically being performed in the mind. For example, but for the generic computer language, the claim encompasses the driver performing the above mental process, by further applying a weight to the risk score calculated correspondingly to the departure from the second driving status criterion. Thus, the claim recites a mental process.
Regarding claim 19, the claim recites the limitations of “recognize a license plate…” and “…re-detection of the at least one target vehicle…”. These limitations, as drafted, are processes that, under their broadest reasonable interpretation, cover performance of the limitations in the mind but for the recitation of the generic computer component “processor”. That is, other than reciting the “processor”, nothing in the claim elements precludes the steps from practically being performed in the mind. For example, but for the generic computer language, the claim encompasses the driver performing the above mental process by further recognizing a license plate of the vehicle and seeing the vehicle again at a later time. Thus, the claim recites a mental process.
Regarding claim 20, the claim recites the limitation of “…set the at least one target vehicle as a danger level”. This limitation, as drafted, is a process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of the generic computer component “processor”. That is, other than reciting the “processor”, nothing in the claim elements precludes the step from practically being performed in the mind. For example, but for the generic computer language, the claim encompasses the driver performing the above mental process, by further setting the vehicle as a danger level when it departs from a preset safety speed or distance. Thus, the claim recites a mental process.
Accordingly, the claims are directed to an abstract idea.
Step 2A Prong Two Evaluation: Practical Application - No
The claims are evaluated to determine whether, as a whole, they integrate the recited judicial exception into a practical application. As noted in the 2019 PEG, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements that merely use a computer to implement an abstract idea, add insignificant extra-solution activity, or generally link use of a judicial exception to a particular technological environment or field of use do not integrate the judicial exception into a “practical application.”
In the present case, the additional limitations beyond the above-noted abstract idea are as follows (where the underlined portions are the “additional limitations” while the bolded portions continue to represent the “abstract idea”).
Claim 15 does not recite any additional elements.
Claims 17-20 recite the additional element of “…wherein the processor is further configured to…”.
Claim 19 recites the additional step of “store the recognized license plate”.
The “processor” does not integrate the abstract idea into a practical application because it is described at a high level of generality and is merely a computer being used as a tool to perform the abstract idea. See MPEP 2106.04(d)(I).
The “store the recognized license plate” step is recited at a high level of generality (i.e., as a general means of storing gathered license plate data) and amounts to mere data gathering and storage, which is a form of insignificant extra-solution activity. See MPEP 2106.05(g).
Accordingly, even in combination, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
Accordingly, the claims are directed to an abstract idea.
Step 2B Evaluation: Inventive concept - No
The claims are evaluated to determine whether, as a whole, they amount to significantly more than the recited exception, i.e., whether any additional element, or combination of additional elements, adds an inventive concept to the claims.
The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed with respect to Step 2A Prong Two, the “processor” is merely a tool being used to perform the abstract idea, and the same analysis applies here. Merely using a computer as a tool to perform an abstract idea cannot integrate a judicial exception into a practical application or provide an inventive concept.
Under the 2019 PEG, a conclusion that an additional element is insignificant extra-solution activity in Step 2A should be re-evaluated in Step 2B. Here, the step of storing the recognized license plate was considered to be insignificant extra-solution activity in Step 2A, and thus it is re-evaluated in Step 2B to determine whether it amounts to more than what is well-understood, routine, conventional activity in the field.
The background recites that the processor is a conventional processor and does not provide any indication that it could be anything other than conventional computer equipment. MPEP 2106.05(d)(II), and the cases cited therein, including Intellectual Ventures I, LLC v. Symantec Corp., 838 F.3d 1307, 1321 (Fed. Cir. 2016), TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610 (Fed. Cir. 2016), and OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363 (Fed. Cir. 2015), indicate that mere collection or receipt of data over a network is a well-understood, routine, and conventional function when it is claimed in a merely generic manner (as it is here).
Claims 15 and 17-20 are not patent eligible.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1, 6-7, 9-12, 17-18 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Hyun (US 20180061253 A1, hereinafter Hyun), as modified by Choi et al. (US 20230052137 A1, hereinafter Choi), Heilbron et al. (US 20230236037 A1, hereinafter Heilbron) and Okuda (US 20150070159 A1, hereinafter Okuda).
Regarding Claims 1, 11 and 12
Regarding claim 1, Hyun teaches a method of controlling an autonomous vehicle comprising a processor (see all Figs.; [0002] and [0005]), the method comprising:
recognizing, by the processor, at least one vehicle driving around the autonomous vehicle within a preset reference range (see Figs. 1A-1B and 3A-3B, all, especially "limited range 304" in Fig. 3A; [0005], [0054 "Referring to a scenario 100 of FIG. 1A, the host vehicle 101 is performing autonomous driving using the autonomous driving apparatus 10, and the autonomous driving apparatus 10 senses vehicles 102 and 103 near the host vehicle 101."]-[0055] and [0080 "Referring to a scenario 310 of FIG. 3A, the autonomous driving apparatus 10 a of a host vehicle 301 measures risks of nearby vehicles 302 and 303 within a range 304 defined based on a location of the host vehicle 301."]);
setting, by the processor, at least one target vehicle by assigning an identifier (ID) to at least one among the at least one vehicle (see [0018 "The autonomous driving method may further include: generating an identifier of the target vehicle based on the appearance characteristic of the target vehicle ... The identifier may include any one or any combination of any two or more of a license plate, a type, and a color of the target vehicle."], [0056 "The autonomous driving apparatus 10 identifies each of the nearby vehicles 102 and 103 based on the data collected from the sensor or camera 12"], [0083] and [0092 "Referring to a scenario 600 of FIG. 6A, an autonomous driving apparatus 10 c of a host vehicle 601 determines a risk of a target vehicle 602, and stores the risk to be associated with an identifier A of the target vehicle 602."]);
analyzing, by the processor, a driving pattern of the at least one target vehicle based on a driving status criterion, the driving status criterion being set based on whether a directional behavior of the at least one target vehicle is a longitudinal behavior or a lateral behavior (see [0005 "...determining a risk of a target vehicle based on either one or both of a driving characteristic..."]-[0010], [0057 "For example, the autonomous driving apparatus 10 determines, based on predefined criteria associated with the nearby vehicle 102/103, a case in which an average speed of the nearby vehicle 102/103 during a predefined time period is relatively high, sudden acceleration is performed a number of times by the nearby vehicle 102/103, or sudden deceleration is performed a number of times by the nearby vehicle 102/103, a case in which a distance between the nearby vehicle 102/103 and a vehicle ahead of the nearby vehicle 102/103 is relatively short, a case in which a number of lane changes are performed by the nearby vehicle 102/103..."] and [0061]-[0067],);
determining, by the processor, a risk level with respect to the at least one target vehicle based on a result of the analyzing (see [0005 "...determining a risk of a target vehicle based on either one or both of a driving characteristic..."]-[0010], [0057 "The autonomous driving apparatus 10 determines a risk of (e.g., a risk or danger presented by) the nearby vehicle 102/103 based on the appearance characteristic of the nearby vehicle 102/103, the driving characteristic of the nearby vehicle 102/103 ... a case in which a production year of the nearby vehicle 102/103 is long time prior to the current date, and calculates the risk of the nearby vehicle 102/103 by applying predefined weights to determination results."], [0061]-[0067], [0076 "The risk of the target vehicle is measured at points, for example, at a real number value out of a possible 1 point or at a discrete level."], [0083] and [0092]-[0095]), and
wherein the driving status criterion comprises a first driving status criterion with respect to the longitudinal behavior (see [0007 "The characteristic associated with the speed of the target vehicle may include any one or any combination of any two or more of the speed of the target vehicle, a speed of the target vehicle relative to the host vehicle, and a difference between the speed of the target vehicle and an average speed of a vehicle near the target vehicle."], [0008 "The driving characteristic may include variances in a speed of the target vehicle."], [0009 "The driving characteristic may include a distance between the target vehicle and a vehicle ahead of the target vehicle."], [0055]-[0057] and [0061]-[0065]) and a second driving status criterion with respect to the lateral behavior (see [0010 "The driving characteristic may include a number of lane changes performed by the target vehicle during a defined time period"], [0055]-[0057] and [0066]-[0067 "The driving characteristic of the target vehicle includes, for example, a lane-keeping time of the target vehicle during a defined or predefined time period."]).
Regarding claim 11, Hyun additionally teaches a non-transitory computer-readable storage medium storing instructions that (see Fig. 9, all; [0005] and [0020]), when executed by a processor, cause the processor to perform the above process (as discussed above).
Regarding claim 12, Hyun additionally teaches an autonomous vehicle comprising a processor (see Fig. 9, all; [0005] and [0020]), wherein the processor is configured to perform the above process (as discussed above).
Hyun is silent regarding displaying, by the processor, a result of the risk level through a display unit equipped in the autonomous vehicle or outputting a warning sound or a message,
wherein the second driving status criterion includes criteria related to:
when the target vehicle is located at a center of a lane but continues steering;
when the target vehicle turns by steering greater than a curvature of a straight road or a curved road on which the target vehicle is driving; and
when the target vehicle changes steering from steering less than the curvature of the curved road on which the target vehicle is driving to steering greater than the curvature of the curved road.
Choi teaches a method of controlling an autonomous vehicle comprising a processor (see Fig. 1, all; [0005]), the method comprising:
recognizing, by the processor, at least one vehicle driving around the autonomous vehicle within a preset reference range (see [0005 "The method includes the steps of, gathering external sensor information on an external region surrounding the host vehicle; analyzing the external sensor information to detect a target vehicle traveling in a lane and a movement of the target vehicle in the lane within the external region..."], [0009]-[0011] and [0020]);
analyzing, by the processor, a driving pattern of the at least one target vehicle based on a driving status criterion, the driving status criterion being set based on whether a directional behavior of the at least one target vehicle is a longitudinal behavior or a lateral behavior (see [0005 "...detect a target vehicle traveling in a lane and a movement of the target vehicle in the lane within the external region; determining whether the movement of the target vehicle in the lane is erratic..."], [0009 "...determining the target vehicle is an erratic vehicle when the movement of the target vehicle includes at least one of: (i) oscillating about a centerline of the lane above a predetermine frequency, (ii) driving beyond a predetermined offset from the centerline of the lane for greater than a predetermined length of time, (iii) fluctuating in speed relative to the average speed above a predetermined standard variation, and (iv) changing lanes beyond a predetermined frequency..."], [0026 "The EVDAM system determines if a target vehicle 104 a, 104 b, 104 c is driving erratically and poses a potential threat or risk to the host vehicle 102 by using statistics of relative lateral position in the y-axis, longitudinal speed in the x-axis, and lane change frequency of the target vehicle 104 a, 104 b, 104 c."] and [0058]-[0061]);
determining, by the processor, a risk level with respect to the at least one target vehicle based on a result of the analyzing (see [0005 "...detect a target vehicle traveling in a lane and a movement of the target vehicle in the lane within the external region; determining whether the movement of the target vehicle in the lane is erratic..."], [0026], [0039]-[0040] and [0062]-[0069]), and
displaying, by the processor, a result of the risk level through a display unit equipped in the autonomous vehicle or outputting a warning sound or a message (see [0005] and [0070 "Moving to block 308, if the risk score is relatively low, for example 25%, the mitigating action may include the EVDAM module 224 communicating with the ADAS module 222 to initiate a tactile, visual, or audible warning to the operator. An example of a tactile warning may include vibrating a feature that is normally in contact with the operator such as the steering wheel or operator seat. An example of the visual warning may include a visual indicator such as a light on a dash, a human machine interface such as the infotainment system, or onboard display. An example of an audible warning may be the sounding of the horn in short and/or long bursts or providing an audible warning over the internal speakers of the infotainment system."]),
wherein the driving status criterion comprises a first driving status criterion with respect to the longitudinal behavior (see [0009 "...determining the target vehicle is an erratic vehicle when the movement of the target vehicle includes at least one of: ... (iii) fluctuating in speed relative to the average speed above a predetermined standard variation..."], [0026 "The EVDAM system determines if a target vehicle 104 a, 104 b, 104 c is driving erratically and poses a potential threat or risk to the host vehicle 102 by using statistics of relative lateral position in the y-axis, longitudinal speed in the x-axis, and lane change frequency of the target vehicle 104 a, 104 b, 104 c."] and [0060]) and a second driving status criterion with respect to the lateral behavior (see [0009 "...determining the target vehicle is an erratic vehicle when the movement of the target vehicle includes at least one of: (i) oscillating about a centerline of the lane above a predetermine frequency, (ii) driving beyond a predetermined offset from the centerline of the lane for greater than a predetermined length of time ... and (iv) changing lanes beyond a predetermined frequency..."], [0026 "The EVDAM system determines if a target vehicle 104 a, 104 b, 104 c is driving erratically and poses a potential threat or risk to the host vehicle 102 by using statistics of relative lateral position in the y-axis, longitudinal speed in the x-axis, and lane change frequency of the target vehicle 104 a, 104 b, 104 c."], [0059] and [0061]), and
wherein the second driving status criterion includes criteria related to:
when the target vehicle is located at a center of a lane but continues steering (see [0009], [0026 "For example, when the EVDAM system determines that: (i) a target vehicle 104 b is oscillating over the centerline 109 of the lane 108 c above a predetermine frequency or the target vehicle 104 b drives above a predetermined un-reasonably high offset from the centerline 109 ... then the EVDAM will assign an “erratic vehicle” status to the target vehicle 104 b."], [0038] and [0059 "Condition B is true when the centerline of the target vehicle oscillates over the centerline of the roadway beyond a predetermined lateral distance, for example 50 cm, and over above a predetermined frequency, such as 0.5 hz."]).
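Purely for illustration (not part of the record or of Choi's disclosure), the four disjunctive erratic-driving criteria quoted above from Choi's [0009] and [0059] could be sketched as follows; the 0.5 Hz oscillation frequency and 50 cm offset come from Choi's examples, while every other numeric threshold is an assumed placeholder:

```python
# Hypothetical sketch of the erratic-vehicle criteria quoted from Choi's
# [0009] and [0059]; only the 0.5 Hz and 50 cm figures appear in Choi, and
# the remaining thresholds are assumed values for illustration.

def is_erratic(osc_freq_hz, lateral_offset_m, offset_duration_s,
               speed_std_dev, lane_changes_per_min):
    """Return True if any one of the four disjunctive criteria is met."""
    OSC_FREQ_MAX = 0.5        # Hz, per the example in Choi [0059]
    OFFSET_MAX = 0.5          # m (50 cm), per the example in Choi [0059]
    OFFSET_TIME_MAX = 3.0     # s, assumed value
    SPEED_STD_MAX = 2.0       # m/s, assumed value
    LANE_CHANGE_MAX = 2.0     # changes/min, assumed value

    return (osc_freq_hz > OSC_FREQ_MAX
            or (lateral_offset_m > OFFSET_MAX
                and offset_duration_s > OFFSET_TIME_MAX)
            or speed_std_dev > SPEED_STD_MAX
            or lane_changes_per_min > LANE_CHANGE_MAX)
```

Because Choi recites "at least one of" the criteria, a single satisfied condition suffices to assign the "erratic vehicle" status.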
Heilbron teaches a method of controlling an autonomous vehicle comprising a processor (see all Figs.; [0005] and [0169]-[0172]), the method comprising:
recognizing, by the processor, at least one vehicle driving around the autonomous vehicle within a preset reference range (see [0169 "FIG. 5F is a flowchart showing an exemplary process 500F for determining whether a leading vehicle is changing lanes, consistent with the disclosed embodiments. At step 580, processing unit 110 may determine navigation information associated with a leading vehicle (e.g., a vehicle traveling ahead of vehicle 200). For example, processing unit 110 may determine the position, velocity (e.g., direction and speed), and/or acceleration of the leading vehicle, using the techniques described in connection with FIGS. 5A and 5B, above."]-[0172]); and
analyzing, by the processor, a driving pattern of the at least one target vehicle based on a driving status criterion, the driving status criterion being set based on whether a directional behavior of the at least one target vehicle is a longitudinal behavior or a lateral behavior (see [0169 "FIG. 5F is a flowchart showing an exemplary process 500F for determining whether a leading vehicle is changing lanes, consistent with the disclosed embodiments. At step 580, processing unit 110 may determine navigation information associated with a leading vehicle (e.g., a vehicle traveling ahead of vehicle 200). For example, processing unit 110 may determine the position, velocity (e.g., direction and speed), and/or acceleration of the leading vehicle, using the techniques described in connection with FIGS. 5A and 5B, above."]-[0172]);
wherein the driving status criterion comprises a first driving status criterion with respect to the longitudinal behavior and a second driving status criterion with respect to the lateral behavior (see [0169 "FIG. 5F is a flowchart showing an exemplary process 500F for determining whether a leading vehicle is changing lanes, consistent with the disclosed embodiments. At step 580, processing unit 110 may determine navigation information associated with a leading vehicle (e.g., a vehicle traveling ahead of vehicle 200). For example, processing unit 110 may determine the position, velocity (e.g., direction and speed), and/or acceleration of the leading vehicle, using the techniques described in connection with FIGS. 5A and 5B, above."]-[0172]), and
wherein the second driving status criterion includes criteria related to:
when the target vehicle turns by steering greater than a curvature of a straight road or a curved road on which the target vehicle is driving (see Fig. 5F, all; [0169]-[0172], especially [0172 "Processing unit 110 may additionally compare the curvature of the snail trail (associated with the leading vehicle) with the expected curvature of the road segment in which the leading vehicle is traveling. The expected curvature may be extracted from map data (e.g., data from map database 160), from road polynomials, from other vehicles’ snail trails, from prior knowledge about the road, and the like. If the difference in curvature of the snail trail and the expected curvature of the road segment exceeds a predetermined threshold, processing unit 110 may determine that the leading vehicle is likely changing lanes."]).
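Purely for illustration (not Heilbron's code), the comparison quoted above from Heilbron's [0172] between the curvature of a leading vehicle's "snail trail" and the expected curvature of the road segment could be sketched as follows; the three-point curvature estimate and the threshold value are assumptions:

```python
# Illustrative sketch of Heilbron's [0172]: flag a likely lane change when
# the snail-trail curvature departs from the expected road curvature by more
# than a predetermined threshold. The curvature estimator and the 0.01
# threshold are assumed for the sketch, not taken from Heilbron.
import math

def path_curvature(p1, p2, p3):
    """Approximate curvature (1/radius) through three (x, y) trail points."""
    a = math.dist(p1, p2)
    b = math.dist(p2, p3)
    c = math.dist(p1, p3)
    # Cross product equals twice the triangle area; kappa = 4*area/(a*b*c).
    cross = ((p2[0] - p1[0]) * (p3[1] - p1[1])
             - (p2[1] - p1[1]) * (p3[0] - p1[0]))
    if a * b * c == 0:
        return 0.0
    return abs(2.0 * cross / (a * b * c))

def likely_lane_change(trail, expected_curvature, threshold=0.01):
    """Compare trail curvature to the map-derived value ([0172])."""
    kappa = path_curvature(*trail[-3:])
    return abs(kappa - expected_curvature) > threshold
```

Per Heilbron, the expected curvature would be extracted from map data, road polynomials, or other vehicles' snail trails.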
Okuda teaches a method of controlling an autonomous vehicle comprising a processor (see all Figs.; [0005]-[0007]), the method comprising:
recognizing, by the processor, at least one vehicle (see Fig. 2, all; [0007 "The low-level consciousness determination system includes curved road determination means for determining whether the vehicle is traveling on the outside of a curved road or on the inside of the curved road when a road on which the vehicle is traveling is a curved road..."]);
analyzing, by the processor, a driving pattern of the at least one vehicle based on a driving status criterion, the driving status criterion being set based on whether a directional behavior of the at least one vehicle is a lateral behavior (see Figs. 2-4, all; [0007 "The low-level consciousness determination system includes curved road determination means for determining whether the vehicle is traveling on the outside of a curved road or on the inside of the curved road when a road on which the vehicle is traveling is a curved road, threshold value setting means for setting different sudden steering determination threshold values when the curved road determination means determines that the vehicle is traveling on the outside of the curved road and when the curved road determination means determines that the vehicle is traveling on the inside of the curved road, and sudden steering detection means for detecting the sudden steering on the basis of the threshold values set by the threshold value setting means after the non-steering state is detected."]-[0009] and [0043]-[0050]);
determining, by the processor, a risk level with respect to the at least one vehicle based on a result of the analyzing (see Fig. 2, steps S13-S14; [0007]-[0010] and [0048]-[0050 "Then, the ECU 30 determines whether there is an abnormal behavior (sudden steering after the non-steering state) using the steering angular speed (S14). At that time, the sudden steering determination threshold value which is set in S13 is used to determine sudden steering."]), and
displaying, by the processor, a result of the risk level through a display unit equipped in the autonomous vehicle or outputting a warning sound or a message (see [0032 "The warning device 20 is a device which outputs a warning for calling the driver's attention to the consciousness-degraded state to the driver. As a method for issuing the warning, for example, the following methods are used: a warning sound is output from a speaker; a warning screen is displayed on a display of a navigation system; a warning lamp in a combination meter is turned on; a seat vibration generation device vibrates a seat; and a steering wheel vibration generation device vibrates the steering wheel. When receiving a warning output signal from the ECU 30, the warning device 20 outputs a warning in response to the warning output signal."] and [0050]),
wherein the driving status criterion comprises a second driving status criterion with respect to the lateral behavior (see Figs. 2-4, all; [0007 "The low-level consciousness determination system includes curved road determination means for determining whether the vehicle is traveling on the outside of a curved road or on the inside of the curved road when a road on which the vehicle is traveling is a curved road, threshold value setting means for setting different sudden steering determination threshold values when the curved road determination means determines that the vehicle is traveling on the outside of the curved road and when the curved road determination means determines that the vehicle is traveling on the inside of the curved road, and sudden steering detection means for detecting the sudden steering on the basis of the threshold values set by the threshold value setting means after the non-steering state is detected."]-[0009] and [0043]-[0050]), and
wherein the second driving status criterion includes criteria related to:
when the vehicle changes steering from steering less than the curvature of the curved road on which the vehicle is driving to steering greater than the curvature of the curved road (see Figs. 2-4, all, especially Fig. 2, step S14; [0007 "...threshold value setting means for setting different sudden steering determination threshold values when the curved road determination means determines that the vehicle is traveling on the outside of the curved road and when the curved road determination means determines that the vehicle is traveling on the inside of the curved road, and sudden steering detection means for detecting the sudden steering on the basis of the threshold values set by the threshold value setting means after the non-steering state is detected."]-[0008], [0017], [0026], [0039 "For example, it is assumed that TH_sv1 is proportional to the curvature of the lane from TH_def..."]-[0040 "For example, this method determines whether a state in which the steering angular speed is equal to or less than a non-steering determination threshold value is maintained for a time that is equal to or greater than a non-steering duration determination threshold value and then determines whether the steering angular speed is equal to or greater than the sudden steering determination threshold value after this condition is satisfied."] and [0044]-[0050 "Then, the ECU 30 determines whether there is an abnormal behavior (sudden steering after the non-steering state) using the steering angular speed (S14). At that time, the sudden steering determination threshold value which is set in S13 is used to determine sudden steering."]).
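Purely for illustration (not Okuda's code), the sudden-steering detection described in Okuda's [0007], [0039]-[0040] and S13-S14 could be sketched as follows: a non-steering state held for a minimum duration, followed by a steering angular speed exceeding a threshold that is set differently on the inside versus the outside of the curve and grows with curvature. All numeric values below are assumed:

```python
# Hypothetical sketch of Okuda's sudden-steering-after-non-steering test.
# The base threshold, curvature gains, and sample counts are assumed values;
# only the structure (S13 threshold selection, S14 detection) follows Okuda.

def sudden_steering_threshold(on_outside_of_curve, curvature):
    TH_DEF = 10.0                                    # deg/s, assumed base
    scale = 50.0 if on_outside_of_curve else 100.0   # assumed gains
    return TH_DEF + scale * curvature                # proportional ([0039])

def detect_sudden_steering(angular_speeds, on_outside, curvature,
                           non_steer_max=1.0, min_hold_samples=3):
    """True when a held non-steering state is followed by sudden steering."""
    threshold = sudden_steering_threshold(on_outside, curvature)
    hold = 0
    for w in angular_speeds:
        if abs(w) <= non_steer_max:
            hold += 1                 # non-steering state persists
        elif hold >= min_hold_samples and abs(w) >= threshold:
            return True               # sudden steering after non-steering
        else:
            hold = 0                  # steering resumed without the spike
    return False
```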
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to further modify the process/non-transitory computer-readable medium/autonomous vehicle of modified Hyun to include a second driving status criterion related to when the target vehicle is located at a center of a lane but continues steering and to display a result of the risk level through a display unit equipped in the autonomous vehicle or output a warning sound or a message, as taught by Choi, in order to detect the target vehicle as engaging in an erratic behavior when it oscillates over the centerline of the lane above a predetermined frequency or drives at an unreasonably high offset from the centerline, and to warn a driver of the erratic behavior.
It would further have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to further modify the process/non-transitory computer-readable medium/autonomous vehicle of modified Hyun to include a second driving status criterion related to when the target vehicle turns by steering greater than a curvature of a straight road or a curved road on which the target vehicle is driving, as taught by Heilbron, in order to detect when the target vehicle is changing lanes.
It would further have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to further modify the process/non-transitory computer-readable medium/autonomous vehicle of modified Hyun to include a second driving status criterion related to when the target vehicle changes steering from steering less than the curvature of the curved road on which the target vehicle is driving to steering greater than the curvature of the curved road, as taught by Okuda, in order to detect sudden steering of the target vehicle, indicating a low-level consciousness of the driver.
Regarding Claims 6 and 17
Modified Hyun teaches the method of claim 1 and the autonomous vehicle of claim 12 (as discussed above in claims 1 and 12),
Hyun further teaches further comprising:
in response to a departure of the directional behavior of the at least one target vehicle from the first driving status criterion or the second driving status criterion, calculating, by the processor, a risk score (see [0007 "The determining of the risk of the target vehicle may include determining the risk of the target vehicle based on whether any one or any combination of any two or more of the speed of the target vehicle, the speed of the target vehicle relative to the host vehicle, and the difference between the speed of the target vehicle and the average speed of the vehicle near the target vehicle exceeds a threshold."], [0008 "The determining of the risk of the target vehicle may include determining the risk of the target vehicle based on whether a number of times that the variances exceed a threshold variance during a defined time period exceeds a defined value."], [0009 "The determining of the risk of the target vehicle may include determining the risk of the target vehicle based on whether the distance is less than a threshold distance."], [0010 "The determining of the risk of the target vehicle may include determining the risk of the target vehicle based on whether the number of lane changes exceeds a threshold number of lane changes."] and [0061]-[0065]);
accumulating, by the processor, the calculated risk score (see [0019 "...receiving any one or any combination of any two or more of a second identifier, a second driving characteristic, and a second appearance characteristic from a server or a vehicle near the host vehicle..."], [0061 "In response to the difference exceeding a threshold speed difference, the autonomous driving apparatus 10 increases the risk of the target vehicle."], [0087 "The autonomous driving apparatus 10 a of the host vehicle 301 generates the identifier of the target vehicle 303 based on the appearance characteristic of the target vehicle 303, and receives any one or any combination of any two or more of a second identifier, a second driving characteristic, and a second appearance characteristic from a server or the vehicle 305 near the host vehicle 301."] and [0095 "In operation 702, the autonomous driving apparatus obtains a second risk corresponding to the generated identifier, among stored identifiers. The second risk is a stored risk, which is distinct from a risk of the target vehicle measured by the autonomous driving apparatus in real time."]); and
determining, by the processor, the risk level based on the accumulated risk score and a plurality of preset reference levels (see [0019 "...updating the risk based on either one or both of the second driving characteristic and the second appearance characteristic, in response to the second identifier being the same as the identifier."], [0061 "The thresholds may be predefined, and may vary and apply based on an intended operational design."], [0063 "For example, the autonomous driving apparatus 10 updates a value of risk to a greater value, and controls the host vehicle 101 based on the updated risk."], [0076 "The risk of the target vehicle is measured at points, for example, at a real number value out of a possible 1 point or at a discrete level."], [0087 "In response to the generated identifier being the same as the received second identifier, the autonomous driving apparatus 10 a updates the risk of the target vehicle 303 based either one or both of the second driving characteristic and the second appearance characteristic. The autonomous driving apparatus 10 a measures or updates the risk of the target vehicle 303 by combining the data obtained from the camera or sensor 12 of the host vehicle 301 and the data received from the nearby vehicle 305 or server."] and [0095 "In operation 703, the autonomous driving apparatus updates the risk of the target vehicle based on the obtained second risk. The autonomous driving apparatus updates the risk of the target vehicle by applying weights to the stored second risk and the risk measured in real time."]).
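Purely for illustration (not Hyun's code), the claimed calculate/accumulate/determine sequence mapped above to Hyun's [0061]-[0065] and [0076] could be sketched as follows: per-departure scores are weighted (echoing Hyun's [0112]), accumulated, capped at 1 point per Hyun's [0076], and bucketed against preset reference levels. The scores, weights, and level boundaries are all assumed values:

```python
# Illustrative sketch of risk-score accumulation and level determination.
# Reference-level boundaries, event scores, and weights are assumptions;
# only the cap at 1 point follows Hyun's [0076] example.

REFERENCE_LEVELS = [(0.0, "safe"), (0.3, "caution"), (0.7, "danger")]

def risk_level(events, weights=None):
    """Accumulate (criterion, score) departures and map to a discrete level."""
    weights = weights or {}
    total = sum(score * weights.get(criterion, 1.0)
                for criterion, score in events)
    total = min(total, 1.0)             # cap at 1 point, per Hyun [0076]
    level = REFERENCE_LEVELS[0][1]
    for bound, name in REFERENCE_LEVELS:
        if total >= bound:
            level = name                # highest boundary met wins
    return total, level
```

A weight greater than 1.0 on a lateral criterion would emphasize lateral departures, in the manner of the weighted risk calculation addressed for claims 7 and 18.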
Regarding Claims 7 and 18
Modified Hyun teaches the method of claim 6 and the autonomous vehicle of claim 17 (as discussed above in claims 6 and 17),
Hyun further teaches further comprising:
applying, by the processor, a weight to the risk score calculated correspondingly to the departure from the second driving status criterion (see [0057], [0095 "The autonomous driving apparatus updates the risk of the target vehicle by applying weights to the stored second risk and the risk measured in real time."] and [0112 "A case in which the risk is to increase includes a case in which the average speed is relatively high, sudden acceleration is performed a number of times, or sudden deceleration is performed a number of times, a case in which the distance from the vehicle ahead is excessively short, or a number of lane changes are performed ... The risk is calculated by applying respective weights to the cases."]).
Regarding Claims 9 and 20
Modified Hyun teaches the method of claim 6 and the autonomous vehicle of claim 17 (as discussed above in claims 6 and 17),
Hyun further teaches further comprising:
in response to a departure of the at least one target vehicle from a preset safety speed or a preset safety distance, setting, by the processor, the at least one target vehicle as a danger level (see [0058] and [0062 "The autonomous driving apparatus 10 compares the average speed of the target vehicle to a threshold average speed. The autonomous driving apparatus 10 determines the risk of the target vehicle by classifying the target vehicle as a dangerous vehicle in response to the average speed of the target vehicle being greater than the threshold average speed, and classifying the target vehicle as a non-dangerous vehicle in response to the average speed of the target vehicle being less than the threshold average speed."]).
Regarding Claim 10
Modified Hyun teaches the method of claim 1 (as discussed above in claim 1),
Hyun further teaches wherein:
the recognizing of the at least one vehicle includes recognizing a plurality of vehicles driving around the autonomous vehicle within a preset reference range (see Figs. 1A-1B and 3A-3B, all, especially "vehicles 302/303" and "limited range 304" in Fig. 3A; [0054 "Referring to a scenario 100 of FIG. 1A, the host vehicle 101 is performing autonomous driving using the autonomous driving apparatus 10, and the autonomous driving apparatus 10 senses vehicles 102 and 103 near the host vehicle 101."]-[0055] and [0080 "Referring to a scenario 310 of FIG. 3A, the autonomous driving apparatus 10 a of a host vehicle 301 measures risks of nearby vehicles 302 and 303 within a range 304 defined based on a location of the host vehicle 301."]);
the setting of the at least one target vehicle includes setting a plurality of target vehicles by assigning an ID to each of the plurality of vehicles (see [0056 "The autonomous driving apparatus 10 identifies each of the nearby vehicles 102 and 103 based on the data collected from the sensor or camera 12"], [0083] and [0092 "The autonomous driving apparatus 10 c of the host vehicle 601 obtains identifiers of nearby vehicles."]);
the analyzing of the driving pattern includes analyzing a driving pattern of each of the plurality of target vehicles (see [0055 "At least one sensor or camera 12 of the host vehicle 101 senses the vehicles 102 and 103 near the host vehicle 101, for example, vehicles ahead, behind, and on both sides of the host vehicle 101. The autonomous driving apparatus 10 generates and stores a driving characteristic or an appearance characteristic of a nearby vehicle 102/103 based on data collected from the sensor and/or camera 12."]-[0057] and [0061]-[0067]); and
the determining of the risk level includes determining a risk level with respect to each of the plurality of target vehicles based on a result of the analyzing of the corresponding driving pattern (see [0055]-[0057 "The autonomous driving apparatus 10 determines a risk of (e.g., a risk or danger presented by) the nearby vehicle 102/103 based on the appearance characteristic of the nearby vehicle 102/103, the driving characteristic of the nearby vehicle 102/103 ... a case in which a production year of the nearby vehicle 102/103 is long time prior to the current date, and calculates the risk of the nearby vehicle 102/103 by applying predefined weights to determination results."], [0060]-[0067], [0077 "The autonomous driving apparatus 10 measures points of risks of nearby vehicles 102 and 103 near the host vehicle, and assigns weights to the nearby vehicles 102 and 103 based on the points of the risks."], [0083] and [0092]-[0095]).
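Purely for illustration (not Hyun's code), the per-vehicle bookkeeping described in Hyun's [0054]-[0057] and [0080] for claim 10 could be sketched as follows: every vehicle sensed within the reference range receives an identifier, and a driving-pattern rule is applied per identifier. The 50 m range, the identifier scheme, and the stand-in speed rule are assumptions:

```python
# Illustrative sketch of multi-vehicle recognition within a preset range and
# per-identifier risk determination. The range, identifiers, and the
# average-speed rule (echoing Hyun's [0062]) use assumed values.
import math

def assess_nearby(host_pos, vehicles, reference_range=50.0):
    """Map vehicle ID -> risk label for every vehicle inside the range."""
    results = {}
    for vid, (pos, avg_speed) in vehicles.items():
        if math.dist(host_pos, pos) > reference_range:
            continue                    # outside the preset reference range
        # Stand-in driving-pattern rule: flag high average speed ([0062]).
        results[vid] = "dangerous" if avg_speed > 30.0 else "non-dangerous"
    return results
```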
Claims 4 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Hyun (as modified by Choi, Heilbron and Okuda) as applied to claims 1 and 12 above, and further in view of Ucar et al. (US 20240166244 A1, hereinafter "Ucar").
Regarding Claims 4 and 15
Modified Hyun teaches the method of claim 1 and the autonomous vehicle of claim 12 (as discussed above in claims 1 and 12),
Hyun further teaches wherein the first driving status criterion is set based on an absolute value change in at least one of a longitudinal position, speed, or acceleration of the at least one target vehicle (see [0007], [0008 "The driving characteristic may include variances in a speed of the target vehicle. The determining of the risk of the target vehicle may include determining the risk of the target vehicle based on whether a number of times that the variances exceed a threshold variance during a defined time period exceeds a defined value."], [0009], [0055]-[0057 "For example, the autonomous driving apparatus 10 determines, based on predefined criteria associated with the nearby vehicle 102/103, a case in which an average speed of the nearby vehicle 102/103 during a predefined time period is relatively high, sudden acceleration is performed a number of times by the nearby vehicle 102/103, or sudden deceleration is performed a number of times by the nearby vehicle 102/103, a case in which a distance between the nearby vehicle 102/103 and a vehicle ahead of the nearby vehicle 102/103 is relatively short..."], [0061]-[0062], [0063 "The autonomous driving apparatus 10 analyzes the stored graph, and counts a number of times that a difference between the speed of the target vehicle and the speed of the vehicle near the target vehicle exceeds a threshold speed difference during a predefined time period."]-[0065] and [0112]) and a relative value in at least one of the longitudinal position, speed, or acceleration with respect to a vehicle in front of the at least one target vehicle (see [0007]-[0009], [0057], [0063 "The autonomous driving apparatus 10 stores a graph of the speed of the target vehicle and the speed of the vehicle near the target vehicle with respect to time. 
The autonomous driving apparatus 10 analyzes the stored graph, and counts a number of times that a difference between the speed of the target vehicle and the speed of the vehicle near the target vehicle exceeds a threshold speed difference during a predefined time period. In response to the counted number of times exceeding a defined or predefined value, the autonomous driving apparatus 10 updates the risk of the target vehicle to a new value."]-[0065 "The autonomous driving apparatus 10 senses relative positions of the host vehicle 101 and a nearby vehicle 102/103 near the host vehicle using the sensor or camera 12 of the host vehicle 101. The autonomous driving apparatus 10 calculates and stores the distance between the target vehicle and the vehicle ahead of the target vehicle based on the sensed relative positions. The autonomous driving apparatus 10 compares the distance between the target vehicle and the vehicle ahead of the target vehicle to distances between the nearby vehicles 102, 103 and vehicles ahead of the nearby vehicles 102, 103, and determines the risk of the target vehicle based on results of the comparison. For example, in response to a difference between the distance between the target vehicle and the vehicle ahead of the target vehicle and an average or intermediate value of the distances between the nearby vehicles and the vehicles ahead of the nearby vehicles being less than a threshold difference, the autonomous driving apparatus 10 increases the risk of the target vehicle."]).
Although it may be implied and/or obvious, Hyun does not explicitly teach a relative value change in at least one of the longitudinal position, speed, or acceleration with respect to a vehicle in front of the at least one target vehicle.
Ucar teaches a method of controlling an autonomous vehicle comprising a processor (see all Figs.; [0004]), the method comprising:
analyzing, by the processor, a driving pattern of the at least one target vehicle based on a driving status criterion, the driving status criterion being set based on whether a directional behavior of the at least one target vehicle is a longitudinal behavior or a lateral behavior (see Fig. 4A, all; [0025]-[0026], [0029 "For example, the detection system 170 determines that a follower vehicle is distracted from erratic deviations (e.g., spikes, dips, etc.) in following distances (e.g., distant-to-short). In one approach, the detection system 170 classifies the follower vehicle as distracted if the follower vehicle has a driving pattern that deviates above a threshold within a time frame. This analysis may involve forming a time-series with observation data for determining a pattern in the following distance."], [0036] and [0042 "For example, the detection system 170 determines that a follower vehicle is distracted from spikes and dips in the following distance above a threshold within a time window."]);
wherein the first driving status criterion is set based a relative value change in at least one of the longitudinal position, speed, or acceleration with respect to a vehicle in front of the at least one target vehicle (see Fig. 4A, all; [0025 "The classifications 240 may define behaviors about driving scenarios and abnormalities from observation data acquired in the sensor data 250. For example, an abnormal classification results from a sudden change in the observation data (e.g., camera data, LIDAR data, etc.) for a separation distance, a following distance, speed, relative speed, and so on between the target or the nearby vehicles."]-[0026], [0029 "For example, the detection system 170 determines that a follower vehicle is distracted from erratic deviations (e.g., spikes, dips, etc.) in following distances (e.g., distant-to-short). In one approach, the detection system 170 classifies the follower vehicle as distracted if the follower vehicle has a driving pattern that deviates above a threshold within a time frame. This analysis may involve forming a time-series with observation data for determining a pattern in the following distance."], [0036 "In FIG. 4A, the detection system 170 acquires distance measurements between the ego vehicle 1001 and the follower vehicle 410 and the preceding vehicle 420 over discrete observation times (e.g., per second). In one approach, the detection system 170 compares the distance measurements in a time-series using locations and predicts or detects abnormal events from abrupt deviations by deriving driving patterns. In certain driving scenarios, the abnormal driving detection 320 identifies observations of abnormal driving and initially classifies the observations as distracted driving 430 by the follower vehicle. 
Here, the separation distance between observation times 15 s-25 s is above 12 m between follower and ego vehicles for the distracted driving 430."] and [0042 "For example, the detection system 170 determines that a follower vehicle is distracted from spikes and dips in the following distance above a threshold within a time window."]).
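Purely for illustration (not Ucar's code), the distraction heuristic quoted above from Ucar's [0029] and [0042], in which erratic spikes and dips in following distance above a threshold within a time window indicate a distracted follower vehicle, could be sketched as follows; the threshold and window size are assumed values:

```python
# Illustrative sketch of Ucar's following-distance time-series analysis:
# flag distraction when any sample-to-sample jump within a sliding window
# exceeds a threshold. The 5.0 m threshold and 5-sample window are assumed.

def is_distracted(following_distances, threshold=5.0, window=5):
    """True if any jump in following distance within a window is too large."""
    for start in range(0, max(1, len(following_distances) - window + 1)):
        segment = following_distances[start:start + window]
        jumps = [abs(b - a) for a, b in zip(segment, segment[1:])]
        if jumps and max(jumps) > threshold:
            return True
    return False
```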
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to further modify the process/autonomous vehicle of modified Hyun to set a first driving status criterion based on a relative value change in at least one of the longitudinal position, speed, or acceleration with respect to a vehicle in front of the at least one target vehicle, as taught by Ucar, in order to detect erratic behaviors of the target vehicle and therefore detect a distracted driver.
Claims 8 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Hyun (as modified by Choi, Heilbron and Okuda) as applied to claims 6 and 17 above, and further in view of Bacchus et al. (US 20190337451 A1, hereinafter "Bacchus").
Regarding Claims 8 and 19
Modified Hyun teaches the method of claim 6 and the autonomous vehicle of claim 17 (as discussed above in claims 6 and 17),
Hyun further teaches further comprising:
in response to determination that the risk level is a danger level among the plurality of preset reference levels, recognizing, by the processor, a license plate of the at least one target vehicle and storing the recognized license plate (see [0018 "The autonomous driving method may further include: generating an identifier of the target vehicle based on the appearance characteristic of the target vehicle; and transmitting any one or any combination of any two or more of the identifier, the driving characteristic, the appearance characteristic, and the risk of the target vehicle to a vehicle near the host vehicle. The identifier may include any one or any combination of any two or more of a license plate, a type, and a color of the target vehicle."], [0058 "Referring to a scenario 110 of FIG. 1B, among the vehicles 102 and 103 near the host vehicle 101, the nearby vehicle 103 is determined to have a risk greater than a defined or predefined criterion. In this example, the autonomous driving apparatus 10 controls the host vehicle 101 based on the risk of the nearby vehicle 103. For example, the autonomous driving apparatus 10 classifies the nearby vehicle 103 as an aggressive vehicle."], [0062 "The autonomous driving apparatus 10 compares the average speed of the target vehicle to a threshold average speed. The autonomous driving apparatus 10 determines the risk of the target vehicle by classifying the target vehicle as a dangerous vehicle in response to the average speed of the target vehicle being greater than the threshold average speed..."] and [0083 "A risk of a target vehicle 303 is shared while being combined with any one or any combination of any two or more of a license plate, a location, a type such as a model..."]).
Hyun is silent regarding in response to re-detection of the at least one target vehicle after departing from the preset reference range, setting, by the processor, a previous danger level as a current danger level for the re-detected target vehicle.
Bacchus teaches a method of controlling an autonomous vehicle comprising a processor (see all Figs.; [0003]), the method comprising:
recognizing, by the processor, at least one vehicle driving around the autonomous vehicle within a preset reference range (see [0003 "An example system includes one or more sensors that measure one or more attributes of a remote object in a predetermined vicinity of the vehicle."], [0072]-[0075] and [0090]);
setting, by the processor, at least one target vehicle by assigning an identifier (ID) to at least one among the at least one vehicle (see [0080] and [0090 "For example, the RDMS 850 keeps track of a sequence of attributes such as identifiers, positions, velocities, etc. for the remote vehicle 1110."] and [0106 "The recklessness score 1045 is stored mapped with one or more identifiers of the remote vehicle 1110, for example, license plate number, barcode, or any other identifier associated with the remote vehicle 1110."]);
analyzing, by the processor, a driving pattern of the at least one target vehicle based on a driving status criterion, the driving status criterion being set based on whether a directional behavior of the at least one target vehicle is a longitudinal behavior or a lateral behavior (see [0003 "An example system includes one or more sensors that measure one or more attributes of a remote object in a predetermined vicinity of the vehicle."], [0006 "The attributes of the remote object include a lateral variability of the remote object that is determined based on a deviation of the remote object within a lane of a road along which the remote object is traveling. The attributes of the remote object include abrupt braking by the remote object that is determined based on a maximum deceleration of the remote object within a predetermined time window. The attributes of the remote object include a number of lane changes by the remote object within a predetermined time window. The attributes of the remote object include a tailgating distance determined for the remote object with respect to a second remote object."] and [0090]-[0099]);
determining, by the processor, a risk level with respect to the at least one target vehicle based on a result of the analyzing (see [0003 "Generating the driver notification includes determining a recklessness score for the remote object based on the attributes of the remote object."] and [0100]-[0109]), and
displaying, by the processor, a result of the risk level through a display unit equipped in the autonomous vehicle or outputting a warning sound or a message (see [0004]-[0006] and [0105 "Alternatively, if the recklessness score is above the predetermined threshold, the method 1000 includes generating and providing an alert about the remote vehicle 1110 to the driver, at 1060. The alert can include a spatial awareness alert that is includes a directional information of the location of the remote vehicle 1110 to the driver along with an intensity of the alert being based on the recklessness score that is computed. The mapping of the recklessness score is performed as described herein. The alert can be provided via the haptic alert device 120, the display device 860, and/or the acoustic system 870 that are part of the augmented reality system 800. In one or more examples, the remote vehicle 1110 may be highlighted in the display device 860 along with directional information being provided via the haptic alert device 120 and/or the acoustic system 870."]),
wherein the driving status criterion comprises a first driving status criterion with respect to the longitudinal behavior (see [0006 "The attributes of the remote object include abrupt braking by the remote object that is determined based on a maximum deceleration of the remote object within a predetermined time window ... The attributes of the remote object include a tailgating distance determined for the remote object with respect to a second remote object."] and [0095]-[0096]) and a second driving status criterion with respect to the lateral behavior (see [0006 "The attributes of the remote object include a lateral variability of the remote object that is determined based on a deviation of the remote object within a lane of a road along which the remote object is traveling ... The attributes of the remote object include a number of lane changes by the remote object within a predetermined time window."], [0090]-[0094] and [0098]-[0099]);
further comprising:
in response to a departure of the directional behavior of the at least one target vehicle from the first driving status criterion or the second driving status criterion, calculating, by the processor, a risk score (see [0003 "Generating the driver notification includes determining a recklessness score for the remote object based on the attributes of the remote object."], [0005 "In one or more examples, determining the recklessness score includes receiving a prior recklessness score of the remote object based on an identification of the remote object, and updating the prior recklessness score using the attributes of the remote object received from the one or more sensors. In one or more examples, storing the updated recklessness score for the remote object to be accessed by a second vehicle."] and [0107]-[0108]);
accumulating, by the processor, the calculated risk score (see [0005 "In one or more examples, determining the recklessness score includes receiving a prior recklessness score of the remote object based on an identification of the remote object, and updating the prior recklessness score using the attributes of the remote object received from the one or more sensors."] and [0107]-[0108 "Accordingly, in this case the updating can use a weighted average between the presently computed recklessness score 1045 and the previous recklessness score of the remote vehicle 1110 from the memory device 815."]); and
determining, by the processor, the risk level based on the accumulated risk score and a plurality of preset reference levels (see [0005 "In one or more examples, determining the recklessness score includes receiving a prior recklessness score of the remote object based on an identification of the remote object, and updating the prior recklessness score using the attributes of the remote object received from the one or more sensors."] and [0107]-[0108 "Accordingly, in this case the updating can use a weighted average between the presently computed recklessness score 1045 and the previous recklessness score of the remote vehicle 1110 from the memory device 815."]);
further comprising:
recognizing, by the processor, a license plate of the at least one target vehicle and storing the recognized license plate (see [0080] and [0106 "For example, the recklessness score 1045 of the remote vehicle 1110 is stored in the memory device 815. The recklessness score 1045 is stored mapped with one or more identifiers of the remote vehicle 1110, for example, license plate number, barcode, or any other identifier associated with the remote vehicle 1110. The stored recklessness score 1045 is used for future access."]); and
in response to re-detection of the at least one target vehicle after departing from the preset reference range, setting, by the processor, a previous danger level as a current danger level for the re-detected target vehicle (see [0090]-[0106 "For example, the recklessness score 1045 of the remote vehicle 1110 is stored in the memory device 815. The recklessness score 1045 is stored mapped with one or more identifiers of the remote vehicle 1110, for example, license plate number, barcode, or any other identifier associated with the remote vehicle 1110. The stored recklessness score 1045 is used for future access. For example, if the remote vehicle 1110 is observed in the vicinity of the vehicle 10 at a future time (e.g. next day, week, month, or the like), the recklessness score 1045 of the remote vehicle 1110 can be accessed from the memory device 815 and an alert can be generated."]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to further modify the method/autonomous vehicle of modified Hyun so that, in response to re-detection of the at least one target vehicle after it departs from the preset reference range, a previous danger level is set as a current danger level for the re-detected target vehicle, as taught by Bacchus, in order to store data on a reckless target vehicle and to alert the driver when that vehicle is re-detected.
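For illustration only (this sketch is not part of the claimed subject matter or the cited disclosures, and all names in it are hypothetical), the combined teaching relied upon above — a risk score kept against a license-plate identifier, updated by a weighted average of the present and prior scores (cf. Bacchus [0107]-[0108]), and restored as the current danger level when the vehicle is re-detected (cf. Bacchus [0106]) — may be sketched as:

```python
# Hypothetical sketch of the relied-upon score-tracking logic; class and
# parameter names are illustrative, not drawn from the cited references.

class RiskTracker:
    def __init__(self, weight=0.7, danger_threshold=0.8):
        self.scores = {}                  # license plate -> accumulated risk score
        self.weight = weight              # weight given to the newly computed score
        self.danger_threshold = danger_threshold

    def update(self, plate, new_score):
        """Blend a newly computed score with any stored prior score
        using a weighted average, and store the result."""
        prior = self.scores.get(plate)
        blended = (new_score if prior is None
                   else self.weight * new_score + (1 - self.weight) * prior)
        self.scores[plate] = blended
        return blended

    def on_redetection(self, plate):
        """On re-detection, return the previously stored score as the
        current danger level (0.0 if the vehicle was never seen)."""
        return self.scores.get(plate, 0.0)

    def is_danger(self, plate):
        """Compare the accumulated score against a preset danger level."""
        return self.scores.get(plate, 0.0) >= self.danger_threshold
```

This is offered only to show that the weighted-average update and the identifier-keyed restore-on-re-detection steps are straightforward to combine, consistent with the rationale stated above.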
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to TANNER LUKE CULLEN whose telephone number is (303)297-4384. The examiner can normally be reached Monday-Friday 9:00-5:00 MT.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Khoi Tran can be reached at (571) 272-6919. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/TANNER L CULLEN/Examiner, Art Unit 3656
/KHOI H TRAN/Supervisory Patent Examiner, Art Unit 3656