DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
Claims 1-13, 15 are presented for examination.
Claims 1-13, 15 are rejected.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-13, 15 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
101 Analysis – Step 1
Claim 1 is directed to “A device…” and claim 13 is directed to “A method…”. Therefore, claims 1 and 13 each fall within at least one of the four statutory categories.
101 Analysis – Step 2A, Prong I
Regarding Prong I of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: (a) mathematical concepts, (b) certain methods of organizing human activity, and/or (c) mental processes.
Independent claim 13 includes limitations that recite an abstract idea (emphasized below) and will be used as a representative claim for the remainder of the 101 rejection. Analogous independent claim 1 is rejected for the same reasons as representative claim 13, as discussed herein. Claim 13 recites:
“A method of initiating a driving maneuver comprising the steps of: receiving environmental sensor data form environmental sensors, including information about a first road user and a second road user in an environment of an autonomous or partially autonomous vehicle and information about objects in the environment of the autonomous or partially autonomous vehicle; determining whether a collision between the first road user and the second road user is imminent based on the environmental sensor data; scheduling a driving maneuver of the autonomous or partially autonomous vehicle based on the environmental sensor data when a collision is imminent, wherein the driving maneuver provides an evasive possibility for the first road user and/or the second road user to avoid the collision; and outputting the driving maneuver to a vehicle control unit of the autonomous or partially autonomous vehicle.”
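For reference, the recited steps can be sketched as a simple procedure. This is a hypothetical illustration by the editor only; the function names, data structures, and the threshold value are assumptions and are not drawn from the application or the prior art of record:

```python
# Hypothetical sketch of the recited method steps. Names, data
# structures, and the time-to-collision threshold are illustrative
# assumptions, not from the application.

def initiate_driving_maneuver(sensor_data, vehicle_control_unit,
                              collision_threshold=0.7):
    # Step 1: receive environmental sensor data, including information
    # about a first and a second road user in the environment.
    first_user = sensor_data["first_road_user"]
    second_user = sensor_data["second_road_user"]

    # Step 2: determine whether a collision between the two road users
    # is imminent (here, a simple closing-distance heuristic).
    closing_speed = first_user["speed"] + second_user["speed"]
    distance = abs(first_user["position"] - second_user["position"])
    time_to_collision = (distance / closing_speed
                         if closing_speed else float("inf"))
    collision_imminent = time_to_collision < collision_threshold

    # Step 3: schedule a driving maneuver that opens an evasive
    # possibility for the road users when a collision is imminent.
    if not collision_imminent:
        return None
    maneuver = {"action": "yield_space", "target_lane": "side_strip"}

    # Step 4: output the maneuver to the vehicle control unit
    # (modeled here as a simple command queue).
    vehicle_control_unit.append(maneuver)
    return maneuver
```

Note that, consistent with the analysis below, the determining and scheduling judgments in such a sketch are simple comparisons that could equally be performed mentally or with pen and paper.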
The examiner submits that the foregoing emphasized limitations constitute a “mental process” because, under its broadest reasonable interpretation, the claim covers performance of the limitations in the human mind. For example, the “determining…” and “scheduling…” steps, in the context of this claim, encompass a person looking at the collected data (received, detected, determined, identified, analyzed, etc.) and forming a simple judgment (determination, analysis, comparison, etc.) either mentally or using pen and paper. Accordingly, the claim recites at least one abstract idea. The Examiner notes that under MPEP 2106.04(a)(2)(III), the courts consider a mental process (thinking) that “can be performed in the human mind, or by a human using a pen and paper” to be an abstract idea. CyberSource Corp. v. Retail Decisions, Inc., 654 F.3d 1366, 1372, 99 USPQ2d 1690, 1695 (Fed. Cir. 2011). As the Federal Circuit explained, “methods which can be performed mentally, or which are the equivalent of human mental work, are unpatentable abstract ideas, the ‘basic tools of scientific and technological work’ that are open to all.” 654 F.3d at 1371, 99 USPQ2d at 1694 (citing Gottschalk v. Benson, 409 U.S. 63, 175 USPQ 673 (1972)). See also Mayo Collaborative Servs. v. Prometheus Labs., Inc., 566 U.S. 66, 71, 101 USPQ2d 1961, 1965 (2012) (“‘[M]ental processes[] and abstract intellectual concepts are not patentable, as they are the basic tools of scientific and technological work’” (quoting Benson, 409 U.S. at 67, 175 USPQ at 675)); Parker v. Flook, 437 U.S. 584, 589, 198 USPQ 193, 197 (1978) (same).
101 Analysis – Step 2A, Prong II
Regarding Prong II of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether the claim, as a whole, integrates the abstract idea into a practical application. As noted in the 2019 PEG, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements that merely use a computer to implement an abstract idea, add insignificant extra-solution activity, or generally link use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application.”
In the present case, the additional limitations beyond the above-noted abstract idea are as follows (where the underlined portions are the “additional limitations” while the bolded portions continue to represent the “abstract idea”):
“A method of initiating a driving maneuver comprising the steps of: receiving environmental sensor data form environmental sensors, including information about a first road user and a second road user in an environment of an autonomous or partially autonomous vehicle and information about objects in the environment of the autonomous or partially autonomous vehicle; determining whether a collision between the first road user and the second road user is imminent based on the environmental sensor data; scheduling a driving maneuver of the autonomous or partially autonomous vehicle based on the environmental sensor data when a collision is imminent, wherein the driving maneuver provides an evasive possibility for the first road user and/or the second road user to avoid the collision; and outputting the driving maneuver to a vehicle control unit of the autonomous or partially autonomous vehicle.”
For the following reason(s), the examiner submits that the above identified additional limitations do not integrate the above-noted abstract idea into a practical application.
Regarding the additional limitations above, the examiner submits that these limitations are insignificant extra-solution activities that merely use a computer (processor) to perform the process. In particular, the “receiving environmental sensor data” step, performed from or using generic sensor system(s), is recited at a high level of generality (i.e., as a general means of receiving environmental sensor data) and amounts to mere data gathering, which is a form of insignificant extra-solution activity. The “provides an evasive possibility for the first road user and/or the second road user to avoid the collision” limitation is likewise recited at a high level of generality and amounts to mere post-solution activity, another form of insignificant extra-solution activity. Lastly, the recitation in claims 1 and 13 of “outputting the driving maneuver to a vehicle control unit of the autonomous or partially autonomous vehicle” merely describes how to generally “apply” the otherwise mental judgments in a generic or general-purpose vehicle control environment. See Alice Corp. Pty. Ltd. v. CLS Bank Int’l, 573 U.S. 208, 223 (2014) (“[T]he mere recitation of a generic computer cannot transform a patent-ineligible abstract idea into a patent-eligible invention.”). The device(s) and processor(s) are recited at a high level of generality and merely automate the steps. In order to expedite prosecution, the Examiner also notes that the mere recitation of “outputting the driving maneuver to a vehicle control unit of the autonomous or partially autonomous vehicle” in claims 1 and 13 is not significant enough to integrate the judicial exception into a practical application, since the claims do not include a positive recitation of “controlling the autonomous or partially autonomous vehicle to avoid the collision…” (if supported by the specification, such a limitation is an example of a limitation significant enough to integrate the judicial exception into a practical application).
Thus, taken alone, the additional elements do not integrate the abstract idea into a practical application. Further, looking at the additional limitations as an ordered combination or as a whole, the limitations add nothing that is not already present when the elements are taken individually. For instance, there is no indication that the additional elements, when considered as a whole, reflect an improvement in the functioning of a computer or an improvement to another technology or technical field; apply or use the above-noted judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition; implement or use the above-noted judicial exception with a particular machine or manufacture that is integral to the claim; effect a transformation or reduction of a particular article to a different state or thing; or apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception (MPEP § 2106.05). Accordingly, the additional limitations do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
101 Analysis – Step 2B
Regarding Step 2B of the 2019 PEG, representative independent claim 13 does not include additional elements (considered both individually and as an ordered combination) that are sufficient to amount to significantly more than the judicial exception, for the same reasons as those discussed above with respect to the determination that the claim does not integrate the abstract idea into a practical application. As discussed above, the additional elements of an input interface, an environmental sensor, an evaluation unit, a scheduling unit, and the other steps amount to nothing more than applying the exception using generic computer components. Generally applying an exception using a generic computer component cannot provide an inventive concept. And, as discussed above, the additional limitations are insignificant extra-solution activities.
The additional limitation of “…wherein the driving maneuver provides an evasive possibility for the first road user and/or the second road user to avoid the collision…” is a well-understood, routine, and conventional activity because the background recites that the sensors are all conventional sensors, and the specification does not provide any indication that the processor is anything other than a conventional computer. MPEP 2106.05(d)(II), and the cases cited therein, including Intellectual Ventures I, LLC v. Symantec Corp., 838 F.3d 1307, 1321 (Fed. Cir. 2016), TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610 (Fed. Cir. 2016), and OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363 (Fed. Cir. 2015), indicate that mere collection or receipt of data over a network is a well-understood, routine, and conventional function when it is claimed in a merely generic manner. The additional limitation of “…outputting the driving maneuver to a vehicle control unit of the autonomous or partially autonomous vehicle” is a well-understood, routine, and conventional activity because the Federal Circuit in Trading Techs. Int’l v. IBG LLC, 921 F.3d 1084, 1093 (Fed. Cir. 2019), and Intellectual Ventures I LLC v. Erie Indemnity Co., 850 F.3d 1315, 1331 (Fed. Cir. 2017), for example, indicated that the mere outputting of data, which in the instant application is the outputting of the driving maneuver to a vehicle control unit, is a well-understood, routine, and conventional function. Hence, the claim is not patent eligible.
Dependent claims 2-12 and 15 do not recite any further limitations that cause the claims to be patent eligible. Rather, the limitations of the dependent claims are directed toward additional aspects of the judicial exception and/or additional elements that do not integrate the judicial exception into a practical application. Therefore, dependent claims 2-12 and 15 are not patent eligible under the same rationale as provided in the rejection of claims 1 and 13.
Therefore, claims 1-13 and 15 are ineligible under 35 U.S.C. § 101.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 4, 7, and 15 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Claims 4, 7, and 15 are rejected for lack of antecedent basis in the following claim language:
4. The device according to claim 2, wherein the evaluation unit is configured for determining an increased probability of collision when there is no line of sight.
7. The device according to claim 1, wherein the scheduling unit is designed to plan an evasive process by which an evasion of the autonomous or partially autonomous vehicle to a side strip of a roadway is effected.
15. The device according to claim 3, wherein the evaluation unit is configured for determining an increased probability of collision when there is no line of sight.
Appropriate correction is required.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
Claims 1-13, 15 in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: “unit”.
A review of the specification shows that the following appears to be the corresponding structure described in the specification for the 35 U.S.C. 112(f) or 35 U.S.C. 112 (pre-AIA ), sixth paragraph limitations: “In FIG. 2 a device according to the disclosure 12 for initiating a driving maneuver is shown schematically. The device 12 comprises an input interface 26, an evaluation unit 28, a planning unit 30 as well as an output unit 32. The units and interfaces can be partially or completely converted into software and/or hardware. For example, the units can be designed as a processor, processor modules or also as software for a processor. The device 12 can be designed in the form of a control unit or a central computer of an autonomous or partially autonomous vehicle or as software for such a control unit or a central computer of an autonomous or partially autonomous vehicle…”, as disclosed in ¶ [0046]-¶ [0051] of the specification.
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-2, 6-7, and 9-13 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Ku et al. (US Pub. No. 2023/0166726 A1; hereinafter “Ku”).
Consider claims 1, 12-13:
Ku teaches a device (Fig. 1 elements “…a vehicle driving control apparatus 10…a sensor 110 and a processor 120, where the processor 120 is coupled to the sensor 110…”), a system (Figs. 1-2 elements 10-120, Steps S210-S230), a method of initiating a driving maneuver (See Ku, e.g., “…vehicle driving control apparatus and a control method and a display method…includes a sensor and a processor…detects current relative position and current relative velocity of an object around a vehicle…calculates a collision probability between the vehicle and the object…determines whether to adjust a driving dynamics of the vehicle based on the collision probability…”, of Abstract, ¶ [0005]-¶ [0009], Figs. 1-2 elements 10-120, Steps S210-S230, Figs. 3-4 elements Steps S310-S460, and Figs. 6-8 elements 810-862, Steps S610-S740) comprising the steps of: receiving environmental sensor data form environmental sensors (See Ku, e.g., “…the vehicle driving control apparatus 10 may receive a plurality of sensing data and map information through the sensor 110…”, of ¶ [0005]-¶ [0009], ¶ [0021]-¶ [0028], ¶ [0037]-¶ [0047], Figs. 1-2 elements 10-120, Steps S210-S230, Figs. 3-4 elements Steps S310-S460, and Figs. 6-8 elements 810-862, Steps S610-S740), including information about a first road user and a second road user in an environment of an autonomous or partially autonomous vehicle (e.g., “…the types of the objects around the vehicle…”, of Figs. 1-2 elements 10-120) and information about objects in the environment of the autonomous or partially autonomous vehicle (See Ku, e.g., “…the vehicle driving control apparatus 10 may receive a plurality of sensing data and map information through the sensor 110…calculate the relative positions and the relative position variation amounts of a future time point based on the current relative positions, current relative velocities and the types of the objects around the vehicle…”, of ¶ [0005]-¶ [0009], ¶ [0021]-¶ [0028], ¶ [0037]-¶ [0047], Figs. 
1-2 elements 10-120, Steps S210-S230, Figs. 3-4 elements Steps S310-S460, and Figs. 6-8 elements 810-862, Steps S610-S740); determining whether a collision between the first road user and the second road user is imminent based on the environmental sensor data (See Ku, e.g., “…in step S220, the processor 120 receives the current relative position and the current relative velocity of the at least one object from the sensor 110…calculates at least one collision probability between the vehicle and the at least one object based on the current relative position and the current relative velocity of the at least one object…based on the current relative position and the current relative velocity of the street lamp…calculate that the collision probability between the vehicle and the street lamp after 1 second is 70%.…”, of ¶ [0005]-¶ [0009], ¶ [0021]-¶ [0029], ¶ [0037]-¶ [0047], Figs. 1-2 elements 10-120, Steps S210-S230, Figs. 3-4 elements Steps S310-S460, and Figs. 6-8 elements 810-862, Steps S610-S740); scheduling (e.g., adjusting a driving dynamics based on the at least one collision probability of Fig. 2 Steps S210-S230) a driving maneuver of the autonomous or partially autonomous vehicle based on the environmental sensor data when a collision is imminent (See Ku, e.g., “…In step S230, the processor 120 determines whether to adjust a driving dynamics of the vehicle according to the at least one collision probability…determine whether to adjust a velocity of the vehicle to prevent a collision according to the collision probability…to brake to reduce the velocity to prevent collision with an object, or provide an auxiliary torque to assist a steering motor (not shown) to drive the vehicle to perform steering, such as reducing a steering angle to prevent collision with the object…”, of ¶ [0005]-¶ [0009], ¶ [0021]-¶ [0029], ¶ [0037]-¶ [0047], Figs. 1-2 elements 10-120, Steps S210-S230, Figs. 3-4 elements Steps S310-S460, and Figs. 
6-8 elements 810-862, Steps S610-S740), wherein the driving maneuver provides an evasive possibility for the first road user and/or the second road user to avoid the collision (See Ku, e.g., “…determine whether to adjust a velocity of the vehicle to prevent a collision according to the collision probability…to brake to reduce the velocity to prevent collision with an object, or provide an auxiliary torque to assist a steering motor (not shown) to drive the vehicle to perform steering, such as reducing a steering angle to prevent collision with the object…”, of ¶ [0005]-¶ [0009], ¶ [0021]-¶ [0029], ¶ [0037]-¶ [0047], Figs. 1-2 elements 10-120, Steps S210-S230, Figs. 3-4 elements Steps S310-S460, and Figs. 6-8 elements 810-862, Steps S610-S740); and outputting the driving maneuver to a vehicle control unit of the autonomous or partially autonomous vehicle (See Ku, e.g., “…determines whether to adjust a driving dynamics of the vehicle according to the at least one collision probability…determine whether to adjust a velocity of the vehicle to prevent a collision according to the collision probability…to brake to reduce the velocity to prevent collision with an object, or provide an auxiliary torque to assist a steering motor…reducing a steering angle to prevent collision with the object…”, of ¶ [0005]-¶ [0009], ¶ [0021]-¶ [0029], ¶ [0037]-¶ [0047], Figs. 1-2 elements 10-120, Steps S210-S230, Figs. 3-4 elements Steps S310-S460, and Figs. 6-8 elements 810-862, Steps S610-S740).
Consider claim 2:
Ku teaches everything claimed as implemented above in the rejection of claim 1. In addition, Ku teaches wherein the evaluation unit is adapted to determine a probability of collision (See Ku, e.g., “…calculates at least one collision probability between the vehicle and the at least one object based on the current relative position and the current relative velocity of the at least one object…based on the current relative position and the current relative velocity of the street lamp…calculate that the collision probability between the vehicle and the street lamp after 1 second is 70%.…”, of ¶ [0005]-¶ [0009], ¶ [0021]-¶ [0029], ¶ [0037]-¶ [0047], Figs. 1-2 elements 10-120, Steps S210-S230, Figs. 3-4 elements Steps S310-S460, and Figs. 6-8 elements 810-862, Steps S610-S740); and to compare the probability with a predefined threshold value (See Ku, e.g., “…compare the at least one collision probability P.sub.collision (tc) with a second probability threshold P.sub.TH2, and take an object relative position corresponding to the collision probability P.sub.collision (tc) that is greater than or equal to the second probability threshold P.sub.TH2 in the at least one collision probability P.sub.collision (tc) as the expected collision point, while object relative positions corresponding to the collision probabilities P.sub.collision (tc) that are less than the second probability threshold P.sub.TH2 in the at least one collision probability P.sub.collision (tc) are not taken as the expected collision point…”, of ¶ [0005]-¶ [0009], ¶ [0021]-¶ [0029], ¶ [0037]-¶ [0047], Figs. 1-2 elements 10-120, Steps S210-S230, Figs. 3-4 elements Steps S310-S460, and Figs. 6-8 elements 810-862, Steps S610-S740).
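The threshold comparison described in Ku (comparing each collision probability P_collision(tc) against the second probability threshold P_TH2 to select expected collision points) can be sketched as follows. This is the editor's illustrative reading only; the function name, data layout, and example values are assumptions, not from Ku:

```python
def expected_collision_points(collision_probabilities, p_th2):
    """Return object relative positions whose collision probability is
    greater than or equal to the second probability threshold P_TH2,
    mirroring the comparison described in Ku (illustrative sketch only;
    positions below the threshold are not taken as collision points)."""
    return [
        position
        for position, probability in collision_probabilities.items()
        if probability >= p_th2
    ]
```

For example, with probabilities of 0.7 and 0.4 for two object positions and P_TH2 = 0.5, only the first position would be taken as an expected collision point.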
Consider claim 6:
Ku teaches everything claimed as implemented above in the rejection of claim 1. In addition, Ku teaches wherein the scheduling unit is designed to plan (e.g., adjusting a driving dynamics based on the at least one collision probability of Fig. 2 Steps S210-S230) a braking action and/or an evasive action (See Ku, e.g., “…to brake to reduce the velocity to prevent collision with an object, or provide an auxiliary torque to assist a steering motor (not shown) to drive the vehicle to perform steering, such as reducing a steering angle to prevent collision with the object…”, of ¶ [0005]-¶ [0009], ¶ [0021]-¶ [0029], ¶ [0037]-¶ [0047], Figs. 1-2 elements 10-120, Steps S210-S230, Figs. 3-4 elements Steps S310-S460, and Figs. 6-8 elements 810-862, Steps S610-S740), by which a first vehicle passing the autonomous or partially autonomous vehicle is given a possibility of steering in front of the autonomous or partially autonomous vehicle (e.g., “…if the object itself has acceleration/deceleration capability, such as a bicycle or a car, an extreme value range of acceleration/deceleration of the object itself may be considered to generate the collision probability…”, of Figs. 3-4 elements Steps S310-S460) to avoid a collision with an oncoming second vehicle (See Ku, e.g., “…In step S230, the processor 120 determines whether to adjust a driving dynamics of the vehicle according to the at least one collision probability…determine whether to adjust a velocity of the vehicle to prevent a collision according to the collision probability…to brake to reduce the velocity to prevent collision with an object, or provide an auxiliary torque to assist a steering motor (not shown) to drive the vehicle to perform steering, such as reducing a steering angle to prevent collision with the object…”, of ¶ [0005]-¶ [0009], ¶ [0021]-¶ [0029], ¶ [0037]-¶ [0047], Figs. 1-2 elements 10-120, Steps S210-S230, Figs. 3-4 elements Steps S310-S460, and Figs. 6-8 elements 810-862, Steps S610-S740).
Consider claim 7:
Ku teaches everything claimed as implemented above in the rejection of claim 1. In addition, Ku teaches wherein the scheduling unit is designed to plan (e.g., adjusting a driving dynamics based on the at least one collision probability of Fig. 2 Steps S210-S230) an evasive process by which an evasion of the autonomous or partially autonomous vehicle to a side strip of the roadway (e.g., “…The curb 820 and the object 840 may be sensed by the sensor 110 and acquired by the processor 120 through recognition, and may also be acquired by the processor 120 from map information for display…”, of Figs. 6-8 elements 810-862, Steps S610-S740) is effected (See Ku, e.g., “…In step S230, the processor 120 determines whether to adjust a driving dynamics of the vehicle according to the at least one collision probability…determine whether to adjust a velocity of the vehicle to prevent a collision according to the collision probability…to brake to reduce the velocity to prevent collision with an object, or provide an auxiliary torque to assist a steering motor (not shown) to drive the vehicle to perform steering, such as reducing a steering angle to prevent collision with the object…”, of ¶ [0005]-¶ [0009], ¶ [0021]-¶ [0029], ¶ [0037]-¶ [0047], Figs. 1-2 elements 10-120, Steps S210-S230, Figs. 3-4 elements Steps S310-S460, and Figs. 6-8 elements 810-862, Steps S610-S740).
Consider claim 9:
Ku teaches everything claimed as implemented above in the rejection of claim 1. In addition, Ku teaches wherein the evaluation unit is designed to ignore data from direct communication (e.g., the collision probabilities are determined based on the sensor data of Figs. 1-2 elements 10-120, Steps S210-S230) with the first road user and/or the second road user (See Ku, e.g., “…detects current relative position and current relative velocity of an object around a vehicle…calculates a collision probability between the vehicle and the object…determines whether to adjust a driving dynamics of the vehicle based on the collision probability…”, of Abstract, ¶ [0005]-¶ [0009], Figs. 1-2 elements 10-120, Steps S210-S230, Figs. 3-4 elements Steps S310-S460, and Figs. 6-8 elements 810-862, Steps S610-S740).
Consider claim 10:
Ku teaches everything claimed as implemented above in the rejection of claim 1. In addition, Ku teaches wherein the input interface is designed to receive the environmental sensor data from a camera, radar and/or lidar sensor (See Ku, e.g., “…the sensor 110 may include a camera, a LiDAR, a radar, an accelerometer, a gyroscope, a weather sensor, a wheel speedometer…”, of ¶ [0005]-¶ [0009], ¶ [0021]-¶ [0029], ¶ [0037]-¶ [0047], Figs. 1-2 elements 10-120, Steps S210-S230, Figs. 3-4 elements Steps S310-S460, and Figs. 6-8 elements 810-862, Steps S610-S740).
Consider claim 11:
Ku teaches everything claimed as implemented above in the rejection of claim 1. In addition, Ku teaches wherein the output unit is designed to control a steering and an acceleration and braking system of the autonomous or partially autonomous vehicle (See Ku, e.g., “…determine whether to adjust a velocity of the vehicle to prevent a collision according to the collision probability…to brake to reduce the velocity to prevent collision with an object, or provide an auxiliary torque to assist a steering motor (not shown) to drive the vehicle to perform steering, such as reducing a steering angle to prevent collision with the object…”, of ¶ [0005]-¶ [0009], ¶ [0021]-¶ [0029], ¶ [0037]-¶ [0047], Figs. 1-2 elements 10-120, Steps S210-S230, Figs. 3-4 elements Steps S310-S460, and Figs. 6-8 elements 810-862, Steps S610-S740).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim(s) 3-5, 15 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ku in view of Kim et al. (US Pub. No.: 2019/0256087 A1: hereinafter “Kim”).
Consider claim 3:
Ku teaches everything claimed as implemented above in the rejection of claim 1. In addition, Ku teaches “…detects current relative position and current relative velocity of an object around a vehicle…calculates a collision probability between the vehicle and the object…determines whether to adjust a driving dynamics of the vehicle based on the collision probability…”, of Abstract, ¶ [0005]-¶ [0009], Figs. 1-2 elements 10-120, Steps S210-S230, Figs. 3-4 elements Steps S310-S460, and Figs. 6-8 elements 810-862, Steps S610-S740. However, Ku does not explicitly teach wherein the evaluation unit is configured to determine whether there is a visual link based on the environmental sensor data between the first road user and the second road user.
In an analogous field of endeavor, Kim teaches wherein the evaluation unit (e.g., The controller 170 of Fig. 7) is configured to determine whether there is a visual link based on the environmental sensor data between the first road user and the second road user (See Kim, e.g., “…determine a possibility of collision between the first object and the second object based on field-of-view information of the first object…The controller 170 may acquire field-of-view information of another vehicle based on positions at the vehicle, the first object, and the second object are located on the 3D map. Based on the positions at which the vehicle, the first object, and the second object are located on the 3D map, the controller 170 may determine whether the second object appears within the field of view of the another vehicle…”, of ¶ [0262]-¶ [0273], Figs. 8-11 elements 100-1130, Steps S910-S931).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine “…vehicle driving control apparatus and a control method and a display method…includes a sensor and a processor…detects current relative position and current relative velocity of an object around a vehicle…calculates a collision probability between the vehicle and the object…determines whether to adjust a driving dynamics of the vehicle based on the collision probability…”, as disclosed in Ku, with “wherein the evaluation unit is configured to determine whether there is a visual link based on the environmental sensor data between the first road user and the second road user.”, as taught in Kim, with a reasonable expectation of success to yield a system and method for efficiently, robustly, and seamlessly operating “…to minimize a possibility of occurrence of an accident, by controlling the host vehicle to operate based on motion of an object...”, as taught in Kim, ¶ [0005].
Consider claim 4:
Ku teaches everything claimed as implemented above in the rejection of claim 1. However, Ku does not explicitly teach wherein the evaluation unit is configured for determining an increased probability of collision when there is no line of sight.
In an analogous field of endeavor, Kim teaches wherein the evaluation unit is configured for determining an increased probability of collision when there is no line of sight (See Kim, e.g., “…determine a possibility of collision between the first object and the second object based on field-of-view information of the first object…The controller 170 may acquire field-of-view information of another vehicle based on positions at the vehicle, the first object, and the second object are located on the 3D map. Based on the positions at which the vehicle, the first object, and the second object are located on the 3D map, the controller 170 may determine whether the second object appears within the field of view of the another vehicle…”, it is evident that the probability of collision increases when there is no LOS, of ¶ [0262]-¶ [0273], Figs. 8-11 elements 100-1130, Steps S910-S931).
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify Ku with the teachings of Kim so as to, with a reasonable expectation of success, yield a system and method for efficiently, robustly, and seamlessly mitigating collisions, thereby preserving lives.
Consider claim 5:
The combination of Ku and Kim teaches everything claimed as implemented above in the rejection of claim 3. Kim teaches wherein the evaluation unit in the autonomous or partially autonomous vehicle is configured for determining whether a line of sight between the first road user and the second road user is blocked (See Kim, e.g., it follows from the cited evidence that the probability of collision increases when there is no LOS, of ¶ [0262]-¶ [0273], Figs. 8-11 elements 100-1130, Steps S910-S931); and the scheduling unit is adapted to plan an avoidance operation by the autonomous or partially autonomous vehicle by which the line of sight between the first road user and the second road user is restored (See Kim, e.g., “…detect an object in a vicinity of the vehicle…control driving of at least one of a power source, a brake apparatus, or a steering apparatus in the vehicle…acquire motion information of the object, and provide a signal to the vehicle drive apparatus based on the motion information to control driving of the at least one of the power source, the brake apparatus, or the steering apparatus..”, of Abstract, ¶ [0262]-¶ [0273], ¶ [0331]-¶ [0345], Figs. 8-11 elements 100-1130, Steps S910-S931). Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify Ku with the teachings of Kim so as to, with a reasonable expectation of success, yield a system and method for efficiently, robustly, and seamlessly preventing collisions.
Consider claim 15:
The combination of Ku and Kim teaches everything claimed as implemented above in the rejection of claim 3. In addition, claim 15 is analyzed and thus rejected with respect to the same reasoning as implemented in the rejection of claim 4.
Claim(s) 8 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ku in view of PARK et al. (US Pub. No.: 2022/0194423 A1: hereinafter “PARK”).
Consider claim 8:
Ku teaches everything claimed as implemented above in the rejection of claim 1. In addition, Ku teaches “…calculates at least one collision probability between the vehicle and the at least one object based on the current relative position and the current relative velocity of the at least one object…based on the current relative position and the current relative velocity of the street lamp…calculate that the collision probability between the vehicle and the street lamp after 1 second is 70%.…”, of ¶ [0005]-¶ [0009], ¶ [0021]-¶ [0029], ¶ [0037]-¶ [0047], Figs. 1-2 elements 10-120, Steps S210-S230, Figs. 3-4 elements Steps S310-S460, and Figs. 6-8 elements 810-862, Steps S610-S740. However, Ku does not explicitly teach wherein the evaluation unit is designed to determine whether a collision is imminent, based on a pretrained artificial neural network.
In an analogous field of endeavor, PARK teaches wherein the evaluation unit is designed to determine whether a collision is imminent, based on a pretrained artificial neural network (See PARK, e.g., “…determining a priority of each of a plurality of neural networks executing on a vehicle processing system based on a contribution of each neural network to overall vehicle safety performance, and allocating computing resources to the plurality of neural networks based on the determined priority of each neural network…dynamically adjust hyperparameters of one or more neural networks…neural networks performing functions for maneuvering and collision avoidance may be more important for ensuring safe vehicle operations moment-to-moment than a neural network for dynamic re-routing based on traffic density…”, of Abstract, ¶ [0003]-¶ [0007], ¶ [0024], Figs. 4-6B elements 400-540, steps 600-614).
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify Ku with the teachings of PARK so as to, with a reasonable expectation of success, yield a system and method for efficiently, robustly, and seamlessly processing sensor data: “To enable rapid analysis of the sensor data and quick decision making based on it, the data from each sensor is processed by a neural network, and so the computing systems of the vehicle…execute numerous neural networks concurrently.”, as taught in PARK, ¶ [0001].
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Takaki et al. (US Pub. No.: 2021/0166564 A1) teaches “System, methods, and other embodiments described herein relate to providing a warning from a subject vehicle to surrounding objects about a collision hazard. In one embodiment, a method includes identifying the surrounding objects of a subject vehicle according to sensor data about a surrounding environment of the subject vehicle. The method includes determining a collision probability indicating a likelihood of collision between a first object and a second object of the surrounding objects. The method includes, in response to the collision probability satisfying a collision threshold, communicating, by the subject vehicle, an alert to at least one of the surrounding objects about the collision hazard associated with the surrounding objects colliding.”
RECKZIEGEL et al. (DE 102019205802 A1) teaches “The present invention creates a method for operating a driver assistance system (10) comprising operating (S1) at least one sensor device (Sil; ...; Sin) in an own vehicle (EF) for monitoring the surroundings of the own vehicle (EF); a recognition (S2) of a high-speed vehicle (SF) which is traveling on a first lane (X1), which is next to a second lane (X2) on which the own vehicle (EF) is traveling, and determining (S2a) a traveling speed (Vsf ) of the high-speed vehicle (SF) in the direction of travel (x) of the own vehicle (EF) based on a travel speed (V0) of the own vehicle (EF) or of a vehicle (4) behind it or of a vehicle (3) ahead by the sensor device (Sil;. ..; Sin) and by a control device (SE); an identification (S3) of a movement of the high-speed vehicle (SF) as an overtaking process of the own vehicle (EF) or the following vehicle (4) or the preceding vehicle (3); an identification (S4) of a change in the movement of the high-speed vehicle (SF) as a termination of the overtaking maneuver.”
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BABAR SARWAR whose telephone number is (571)270-5584. The examiner can normally be reached on Mon-Fri 9:00 AM-5:00 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Faris S. Almatrahi, can be reached at (313)446-4821. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/BABAR SARWAR/Primary Examiner, Art Unit 3667