DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
Claims 1-20 are presented for examination.
Claims 1-7, 11-17 are rejected.
Claims 8-10, 18-20 are objected to.
Response to Arguments
Applicant's arguments filed 11/25/2025 have been fully considered but they are not persuasive.
Applicant argues that some of the claimed subject matter is not taught by the prior art of record, namely: “and controlling, based on the determined avoidance behavior type, lateral movement of the vehicle for collision avoidance without crossing outer edges of a current driving lane.”
The Examiner respectfully directs Applicant's attention to the following teachings of the prior art of record. Shah teaches, e.g., “…implement techniques to improve collision prediction and avoidance between a vehicle and objects in an environment…generates a relevance polygon associated with a planned path of the vehicle…identifies objects in the environment and determines whether the objects are located within a boundary of the relevance polygon…determines that the object is relevant to the vehicle and includes data associated therewith in vehicle control planning considerations…”, “…an autonomous (or semi-autonomous) vehicle 102 in an environment 100, wherein a relevance polygon 104 of the autonomous vehicle 102 and a plurality of objects 106…the vehicle computing system determines the potential object interaction(s) based on a determination that an intersection between the predicted object trajectories 124 is within a threshold distance of one or more of the objects 106. In such examples, the vehicle computing system may determine that, absent a modification to one or more of the associated object trajectories 124, the objects 106 may collide…”, and “…the vehicle computing system may determine one or more potential object interactions based on the one or more predicted trajectories 224…the potential object interactions may include potential collisions between objects 206…determine that a first object trajectory 224(1) associated with the first object 206(1) is substantially perpendicular to a second object trajectory 224(2) associated with the third object 206(3)…determine to modify the relevance polygon 204 to include the first object 206(1)…include object data associated with the first object 206(1) in control planning considerations…determine a resulting object interaction between the objects 206(1) and 206(3)…determine an action for the vehicle 102…” of ¶ [0020]-¶ [0023], ¶ [0025]-¶ [0029], ¶ [0038]-¶ [0048], ¶ [0051]-¶ [0053], ¶ [0068]-¶ [0071], ¶ [0085]-¶ [0086], ¶ [0121]-¶ [0126], and Figs. 1-2 elements 100-224, Figs. 4-7 steps 400-712.
TAKAKI, in turn, teaches, e.g., “…detects an object present ahead of a vehicle in a vehicle traveling direction based on an image acquired by an imaging section…”, “…expanding the determination region Wcd increases the opportunity for the collision lateral position Xpc calculated on the basis of the movement track of the target Ob to be within the determination region Wcd, making PCS easy to activate. In contrast, reducing the determination region Wcd decreases the opportunity for the collision lateral position Xpc to be within the determination region Wcd, making PCS difficult to activate…”, and “…advancing TTC advances the time of starting each operation of PCS, making PCS easy to activate. In contrast, delaying TTC delays the time of starting each operation of PCS, making PCS difficult to activate…The driving assist ECU 20 delays the time of activating each operation to make PCS more difficult to activate as the relative distance Dr acquired in step S13 increases…defines the value of the amount of correction Av2 such that the amount of correction Av2 decreases as the relative distance Dr increases at a certain difference ΔMD in the movement direction. Therefore, steps S15 and S16 function as a changing step…The driving assist ECU 20 makes PCS more difficult to activate as the relative distance Dr acquired in step S13 increases…defines the value of the amount of correction Av1 such that the amount of correction Av1 decreases as the relative distance Dr increases at a certain difference ΔMD in the movement direction…” of Abstract, ¶ [0025]-¶ [0028], ¶ [0042]-¶ [0067], and Fig. 2 elements Ob, CS, Figs. 4-8 steps S11-S19. The motivation to combine, with a reasonable expectation of success, is to “provide a vehicle control apparatus and a vehicle control method capable of preventing unnecessary operation while performing various types of control for improving traveling safety of a vehicle”, as taught in TAKAKI at ¶ [0007].
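By way of illustration only, and not as part of any cited reference's actual disclosure, the distance-dependent activation logic quoted above from TAKAKI might be sketched as follows; the function and parameter names (e.g., gain) are hypothetical, with only Dr, ΔMD, and the correction amounts Av1/Av2 drawn from the quoted passages.

    def corrected_activation_ttc(base_ttc_s, relative_distance_dr_m,
                                 direction_difference_dmd, gain=0.5):
        # Hypothetical sketch: the PCS activation TTC is advanced by a
        # correction (cf. Av1/Av2) that decreases as the relative distance
        # Dr increases for a given difference in movement direction (ΔMD),
        # making PCS harder to activate for distant objects, consistent
        # with the changing step of TAKAKI's steps S15-S16.
        correction_av = gain * direction_difference_dmd / max(relative_distance_dr_m, 1.0)
        return base_ttc_s + correction_av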
One cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., Inc., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). Therefore, the previous rejection is maintained, with the clarifications of the Examiner's position set forth below.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-7 and 11-17 are rejected under 35 U.S.C. 103 as being unpatentable over Shah in view of TAKAKI, and further in view of Hartnett.
Consider claims 1, 11:
Shah teaches a vehicle, a method for avoiding collision of a vehicle (See Shah, e.g., “…implement techniques to improve collision prediction and avoidance between a vehicle and objects in an environment…generates a relevance polygon associated with a planned path of the vehicle…identifies objects in the environment and determines whether the objects are located within a boundary of the relevance polygon…determines that the object is relevant to the vehicle and includes data associated therewith in vehicle control planning considerations…” of Abstract, ¶ [0010]-¶ [0018], ¶ [0020]-¶ [0023], and Figs. 1-2 elements 100-224, Figs. 4-7 steps 400-712), comprising: obtaining surrounding environment information comprising nearby object information or road information (See Shah, e.g., “…an autonomous (or semi-autonomous) vehicle 102 in an environment 100, wherein a relevance polygon 104 of the autonomous vehicle 102 and a plurality of objects 106…the vehicle computing system determines the potential object interaction(s) based on a determination that an intersection between the predicted object trajectories 124 is within a threshold distance of one or more of the objects 106. In such examples, the vehicle computing system may determine that, absent a modification to one or more of the associated object trajectories 124, the objects 106 may collide…” of ¶ [0010]-¶ [0018], ¶ [0020]-¶ [0023], ¶ [0025]-¶ [0029], ¶ [0038]-¶ [0048], and Figs. 1-2 elements 100-224, Figs. 4-7 steps 400-712); predicting a position change of a nearby object based on the surrounding environment information (See Shah, e.g., “…the perception component 322 may provide processed sensor data that indicates one or more characteristics associated with a detected object (e.g., a tracked object) and/or the environment in which the object is positioned. In some examples, characteristics associated with an object may include, but are not limited to, an x-position (global and/or local position), a y-position (global and/or local position), a z-position (global and/or local position), an orientation (e.g., a roll, pitch, yaw), an object type (e.g., a classification), a velocity of the object…” of ¶ [0020]-¶ [0023], ¶ [0025]-¶ [0029], ¶ [0038]-¶ [0048], ¶ [0085]-¶ [0086], and Figs. 1-2 elements 100-224, Figs. 4-7 steps 400-712); and determining one among a plurality of avoidance behavior types for collision avoidance in a lane as avoidance behavior types of the vehicle (See Shah, e.g., “…At operation 420, the process 400 includes controlling the vehicle based on inclusion of data associated with the object. In various examples, the vehicle computing system may control the vehicle based on inclusion of the object by including the object data associated with the object in vehicle control planning considerations…include object data associated with the first object 206(1) in control planning considerations…determine a resulting object interaction between the objects 206(1) and 206(3) and/or to determine an action for the vehicle 102…” of Fig. 4 steps 414-424), based on the predicted position change of the nearby object (See Shah, e.g., “…the vehicle computing system may determine one or more potential object interactions based on the one or more predicted trajectories 224…the potential object interactions may include potential collisions between objects 206…determine that a first object trajectory 224(1) associated with the first object 206(1) is substantially perpendicular to a second object trajectory 224(2) associated with the third object 206(3)…determine to modify the relevance polygon 204 to include the first object 206(1)…include object data associated with the first object 206(1) in control planning considerations…determine a resulting object interaction between the objects 206(1) and 206(3)…determine an action for the vehicle 102…” of ¶ [0020]-¶ [0023], ¶ [0025]-¶ [0029], ¶ [0038]-¶ [0048], ¶ [0051]-¶ [0053], ¶ [0068]-¶ [0071], ¶ [0085]-¶ [0086], ¶ [0121]-¶ [0126], and Figs. 1-2 elements 100-224, Figs. 4-7 steps 400-712).
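For illustration only (hypothetical names; not Shah's actual implementation), the threshold-distance interaction check quoted above might be sketched as:

    import math

    def trajectories_conflict(traj_a, traj_b, threshold_m=2.0):
        # Flag a potential object interaction when two predicted
        # trajectories, sampled at common time steps, pass within a
        # threshold distance of one another (cf. Shah's predicted object
        # trajectories 124/224 and the threshold-distance determination).
        return any(math.hypot(ax - bx, ay - by) < threshold_m
                   for (ax, ay), (bx, by) in zip(traj_a, traj_b))

For example, trajectories_conflict([(0, 0), (1, 0)], [(0, 1), (1, 0.5)]) returns True under the assumed default 2.0 m threshold.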
Shah further teaches “…At operation 420, the process 400 includes controlling the vehicle based on inclusion of data associated with the object. In various examples, the vehicle computing system may control the vehicle based on inclusion of the object by including the object data associated with the object in vehicle control planning considerations…include object data associated with the first object 206(1) in control planning considerations…determine a resulting object interaction between the objects 206(1) and 206(3) and/or to determine an action for the vehicle 102…” of Fig. 4 steps 414-424. Hence, Shah teaches “and controlling, based on the determined avoidance behavior type, movement of the vehicle for collision avoidance without crossing outer edges of a current driving lane”. However, Shah does not explicitly teach “and controlling, based on the determined avoidance behavior type, lateral movement of the vehicle”.
In an analogous field of endeavor, TAKAKI teaches and controlling, based on the determined avoidance behavior type, lateral movement of the vehicle (See TAKAKI, e.g., “A driving assist ECU acquires, based on an image, positions of at least two specific points of an object that are different in a lateral direction with respect to a vehicle traveling direction. The driving assist ECU also performs collision avoidance control for avoiding a collision with the object based on a movement track of the object obtained from a history of the positions of the specific points, and calculates, for each of the specific points, a movement direction of each of the specific points based on the history of the position of each of the specific points. The driving assist ECU then changes how to perform the collision avoidance control based on a difference between the movement directions at the respective specific points….”, “…detects an object present ahead of a vehicle in a vehicle traveling direction based on an image acquired by an imaging section…”, “…expanding the determination region Wcd increases the opportunity for the collision lateral position Xpc calculated on the basis of the movement track of the target Ob to be within the determination region Wcd, making PCS easy to activate. In contrast, reducing the determination region Wcd decreases the opportunity for the collision lateral position Xpc to be within the determination region Wcd, making PCS difficult to activate…”, of Abstract, ¶ [0025]-¶ [0028], ¶ [0042]-¶ [0067], and Fig. 2 elements Ob, CS, Figs. 4-8 steps S11-S19).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to combine “…implement techniques to improve collision prediction and avoidance between a vehicle and objects in an environment…generates a relevance polygon associated with a planned path of the vehicle…identifies objects in the environment and determines whether the objects are located within a boundary of the relevance polygon…determines that the object is relevant to the vehicle and includes data associated therewith in vehicle control planning considerations…” disclosed in Shah with “and controlling, based on the determined avoidance behavior type, lateral movement of the vehicle”, as taught in TAKAKI with a reasonable expectation of success to yield “provide a vehicle control apparatus and a vehicle control method capable of preventing unnecessary operation while performing various types of control for improving traveling safety of a vehicle”, as taught in ¶ [0007].
The combination of Shah and TAKAKI teaches everything claimed as implemented above. However, the combination does not explicitly teach wherein the plurality of avoidance behavior types comprises an evasive steering to right (ESR) type, an evasive steering to left (ESL) type, or a decelerating (DEC) type.
In an analogous field of endeavor, Hartnett teaches wherein the plurality of avoidance behavior types comprises an evasive steering to right (ESR) type, an evasive steering to left (ESL) type, or a decelerating (DEC) type (See Hartnett, e.g., “…in response to analyzing the prediction data, an on-board computing device may execute one or more control instructions that cause the autonomous vehicle to decelerate or brake at an intersection in order to yield to the object…If the risk exceeds an acceptable threshold, it may determine whether the collision can be avoided if the autonomous vehicle follows a defined vehicle trajectory and/or implements one or more dynamically generated emergency maneuvers is performed in a pre-defined time period (e.g., N milliseconds). If the collision can be avoided, then the on-board computing device 212 may execute one or more control instructions to perform a cautious maneuver (e.g., mildly slow down, accelerate, change lane, or swerve)…” of ¶ [0040], ¶ [0084], and Fig. 2 elements 201-268, Fig. 3 steps 300-318).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the combination of Shah and TAKAKI with the teachings of Hartnett with a reasonable expectation of success to yield an enhanced, robust, and seamless system and method “for an autonomous vehicle to accurately predict which trajectory a vehicle will follow.”, as taught in Hartnett at ¶ [0001].
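Purely as an illustrative sketch of the claimed behavior types (all names and command values hypothetical; not Hartnett's disclosure):

    from enum import Enum

    class AvoidanceBehaviorType(Enum):
        ESR = "evasive_steering_right"
        ESL = "evasive_steering_left"
        DEC = "decelerating"

    def command_for(behavior):
        # Map each claimed avoidance behavior type to a hypothetical
        # (steering, braking) command pair; steering magnitudes are kept
        # small so the lateral movement stays within the current lane.
        if behavior is AvoidanceBehaviorType.ESR:
            return (+0.3, 0.0)  # steer right within the lane
        if behavior is AvoidanceBehaviorType.ESL:
            return (-0.3, 0.0)  # steer left within the lane
        return (0.0, 0.5)       # DEC: hold heading, apply braking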
Consider claims 2, 12:
The combination of Shah, TAKAKI, and Hartnett teaches everything claimed as implemented above in the rejection of claims 1, 11. In addition, Shah teaches wherein the processor is further configured to: generate a first predicted trajectory for a behavior of the nearby object based on the predicted position change of the nearby object (See Shah, e.g., “…an autonomous (or semi-autonomous) vehicle 102 in an environment 100, wherein a relevance polygon 104 of the autonomous vehicle 102 and a plurality of objects 106…the vehicle computing system determines the potential object interaction(s) based on a determination that an intersection between the predicted object trajectories 124 is within a threshold distance of one or more of the objects 106. In such examples, the vehicle computing system may determine that, absent a modification to one or more of the associated object trajectories 124, the objects 106 may collide…” of ¶ [0010]-¶ [0018], ¶ [0020]-¶ [0023], ¶ [0025]-¶ [0029], ¶ [0038]-¶ [0048], and Figs. 1-2 elements 100-224, Figs. 4-7 steps 400-712); predict whether collision avoidance is possible for each of the plurality of avoidance behavior types using the first predicted trajectory for the behavior of the nearby object and an avoidance trajectory for each of the plurality of avoidance behavior types (See Shah, e.g., “…the vehicle computing system may determine one or more potential object interactions based on the one or more predicted trajectories 224…the potential object interactions may include potential collisions between objects 206…determine that a first object trajectory 224(1) associated with the first object 206(1) is substantially perpendicular to a second object trajectory 224(2) associated with the third object 206(3)…determine to modify the relevance polygon 204 to include the first object 206(1)…include object data associated with the first object 206(1) in control planning considerations…determine a resulting object interaction between the objects 206(1) and 206(3)…determine an action for the vehicle 102…” of ¶ [0020]-¶ [0023], ¶ [0025]-¶ [0029], ¶ [0038]-¶ [0048], ¶ [0051]-¶ [0053], ¶ [0068]-¶ [0071], ¶ [0085]-¶ [0086], ¶ [0121]-¶ [0126], and Figs. 1-2 elements 100-224, Figs. 4-7 steps 400-712); and determine a first avoidance behavior type predicted to be capable of collision avoidance among the plurality of avoidance behavior types as the avoidance behavior type of the vehicle (See Shah, e.g., “…the vehicle computing system may determine one or more potential object interactions based on the one or more predicted trajectories 224…the potential object interactions may include potential collisions between objects 206…determine that a first object trajectory 224(1) associated with the first object 206(1) is substantially perpendicular to a second object trajectory 224(2) associated with the third object 206(3)…” of ¶ [0020]-¶ [0023], ¶ [0025]-¶ [0029], ¶ [0038]-¶ [0048], ¶ [0051]-¶ [0053], ¶ [0068]-¶ [0071], ¶ [0085]-¶ [0086], ¶ [0121]-¶ [0126], and Figs. 1-2 elements 100-224, Figs. 4-7 steps 400-712).
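As a hypothetical sketch of the per-type feasibility test recited in claims 2 and 12 (all names and the clearance value assumed; not the actual implementation of any cited reference):

    import math

    def select_avoidance_type(object_trajectory, candidate_trajectories,
                              min_clearance_m=1.5):
        # candidate_trajectories maps each avoidance behavior type to the
        # ego vehicle's avoidance trajectory for that type, sampled at the
        # same time steps as the object's first predicted trajectory.
        for behavior_type, ego_trajectory in candidate_trajectories.items():
            if all(math.hypot(ox - ex, oy - ey) >= min_clearance_m
                   for (ox, oy), (ex, ey) in zip(object_trajectory, ego_trajectory)):
                return behavior_type  # first type predicted capable of avoidance
        return None  # no candidate predicted capable of in-lane avoidance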
Consider claims 3, 13:
The combination of Shah, TAKAKI, and Hartnett teaches everything claimed as implemented above in the rejection of claims 1, 11. In addition, Shah teaches wherein the processor is further configured to generate a final avoidance trajectory by changing an avoidance trajectory corresponding to the avoidance behavior type of the vehicle based on the surrounding environment information (“…At operation 420, the process 400 includes controlling the vehicle based on inclusion of data associated with the object. In various examples, the vehicle computing system may control the vehicle based on inclusion of the object by including the object data associated with the object in vehicle control planning considerations…include object data associated with the first object 206(1) in control planning considerations…determine a resulting object interaction between the objects 206(1) and 206(3) and/or to determine an action for the vehicle 102…” of Fig. 4 steps 414-424).
Consider claims 4, 14:
The combination of Shah, TAKAKI, and Hartnett teaches everything claimed as implemented above in the rejection of claims 3, 13. In addition, Shah teaches wherein the processor is further configured to: determine a maximum allowable distance for a lateral behavior of the vehicle, based on position information of the nearby object included in the nearby object information (See Shah, e.g., “…generate a relevance polygon based on a speed of the vehicle…represents an area proximate the vehicle in which objects may be relevant to the operation of the vehicle…include a length (e.g., longitudinal distance), a width (e.g., lateral distance), and/or a height (e.g., vertical distance) around the vehicle. The length, the width, and/or the height of the relevance polygon may be determined based on the speed of the vehicle. For example, as a speed of the vehicle increases, a length and a width of the relevance polygon may increase. Later, as the speed of the vehicle decreases, the length and the width of the relevance polygon may decrease…” of ¶ [0020]-¶ [0023], ¶ [0025]-¶ [0029], ¶ [0038]-¶ [0048], ¶ [0051]-¶ [0053], ¶ [0068]-¶ [0071], ¶ [0085]-¶ [0086], ¶ [0121]-¶ [0126], and Figs. 1-2 elements 100-224, Figs. 4-7 steps 400-712); and generate the final avoidance trajectory by changing the avoidance trajectory corresponding to the avoidance behavior type of the vehicle based on the maximum allowable distance for the lateral behavior (See Shah, e.g., “…the vehicle computing system may determine one or more potential object interactions based on the one or more predicted trajectories 224…the potential object interactions may include potential collisions between objects 206…determine that a first object trajectory 224(1) associated with the first object 206(1) is substantially perpendicular to a second object trajectory 224(2) associated with the third object 206(3)…determine to modify the relevance polygon 204 to include the first object 206(1)…include object data associated with the first object 206(1) in control planning considerations…determine a resulting object interaction between the objects 206(1) and 206(3)…determine an action for the vehicle 102…” of ¶ [0020]-¶ [0023], ¶ [0025]-¶ [0029], ¶ [0038]-¶ [0048], ¶ [0051]-¶ [0053], ¶ [0068]-¶ [0071], ¶ [0085]-¶ [0086], ¶ [0121]-¶ [0126], and Figs. 1-2 elements 100-224, Figs. 4-7 steps 400-712).
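By way of illustration only, the speed-scaled sizing rule Shah describes for the relevance polygon might be sketched as follows (base sizes and gains are hypothetical):

    def relevance_polygon_size(speed_mps, base_length_m=20.0, base_width_m=4.0,
                               length_gain_s=2.0, width_gain_s=0.1):
        # The polygon's length and width grow as vehicle speed increases
        # and shrink again as the vehicle slows, per Shah's description;
        # the width term bounds the lateral area considered for avoidance.
        return (base_length_m + length_gain_s * speed_mps,
                base_width_m + width_gain_s * speed_mps)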
Consider claims 5, 15:
The combination of Shah, TAKAKI, and Hartnett teaches everything claimed as implemented above in the rejection of claims 4, 14. In addition, Shah teaches wherein, in response to there being another object in an avoidance direction corresponding to the avoidance behavior type of the vehicle (e.g., “…The vehicle computing system determines that detected objects within the relevance polygon are relevant to the vehicle, whereas detected objects outside the relevance polygon are irrelevant to the vehicle. Based on a determination of irrelevance, the vehicle computing system may withhold data associated with the irrelevant objects from vehicle control planning considerations…” of Figs. 1-2 elements 100-224), the processor is configured to limit the maximum allowable distance for the lateral behavior based on a position of the other object (See Shah, e.g., “…generate a relevance polygon based on a speed of the vehicle…represents an area proximate the vehicle in which objects may be relevant to the operation of the vehicle…include a length (e.g., longitudinal distance), a width (e.g., lateral distance), and/or a height (e.g., vertical distance) around the vehicle. The length, the width, and/or the height of the relevance polygon may be determined based on the speed of the vehicle. For example, as a speed of the vehicle increases, a length and a width of the relevance polygon may increase. Later, as the speed of the vehicle decreases, the length and the width of the relevance polygon may decrease…” of ¶ [0020]-¶ [0023], ¶ [0025]-¶ [0029], ¶ [0038]-¶ [0048], ¶ [0051]-¶ [0053], ¶ [0068]-¶ [0071], ¶ [0085]-¶ [0086], ¶ [0121]-¶ [0126], and Figs. 1-2 elements 100-224, Figs. 4-7 steps 400-712).
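A hypothetical sketch of the limiting step recited in claims 5 and 15 (function name and safety margin assumed):

    def limit_lateral_allowance(max_lateral_m, other_object_lateral_m=None,
                                margin_m=0.5):
        # If another object occupies the avoidance direction, cap the
        # maximum allowable lateral excursion at that object's lateral
        # offset minus a safety margin; otherwise leave it unchanged.
        if other_object_lateral_m is None:
            return max_lateral_m
        return min(max_lateral_m, max(other_object_lateral_m - margin_m, 0.0))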
Consider claims 6, 16:
The combination of Shah, TAKAKI, and Hartnett teaches everything claimed as implemented above in the rejection of claims 4, 14. In addition, Shah teaches wherein the processor is configured to generate the final avoidance trajectory by further considering the road information (See Shah, e.g., “…the perception component 322 may provide processed sensor data that indicates a presence of an object (e.g., entity) that is proximate to the vehicle 302 and/or a classification of the object as an object type (e.g., car, pedestrian, cyclist, animal, building, tree, road surface, curb, sidewalk, unknown, etc.). In some examples, the perception component 322 may provide processed sensor data that indicates a presence of a stationary entity that is proximate to the vehicle 302 and/or a classification of the stationary entity as a type (e.g., building, tree, road surface, curb, sidewalk, unknown, etc.)…” of ¶ [0020]-¶ [0023], ¶ [0025]-¶ [0029], ¶ [0038]-¶ [0048], ¶ [0051]-¶ [0053], ¶ [0068]-¶ [0071], ¶ [0083]-¶ [0086], ¶ [0121]-¶ [0126], and Figs. 1-2 elements 100-224, Figs. 4-7 steps 400-712), and wherein the road information comprises one of or any combination of a curvature of a road on which the vehicle is traveling, a curvature change rate of the road on which the vehicle is traveling, and a slope of the road on which the vehicle is traveling (See Shah, e.g., “…the perception component 322 may provide processed sensor data that indicates a presence of a stationary entity that is proximate to the vehicle 302 and/or a classification of the stationary entity as a type (e.g., building, tree, road surface, curb, sidewalk, unknown, etc.)…” of ¶ [0020]-¶ [0023], ¶ [0025]-¶ [0029], ¶ [0038]-¶ [0048], ¶ [0051]-¶ [0053], ¶ [0068]-¶ [0071], ¶ [0085]-¶ [0086], ¶ [0121]-¶ [0126], and Figs. 1-2 elements 100-224, Figs. 4-7 steps 400-712).
Consider claims 7, 17:
The combination of Shah, TAKAKI, and Hartnett teaches everything claimed as implemented above in the rejection of claims 1, 11. In addition, TAKAKI teaches wherein the processor is further configured to: generate an image comprising a first predicted trajectory for the behavior of the nearby object based on the predicted position change of the nearby object (See TAKAKI, e.g., “…detects an object present ahead of a vehicle in a vehicle traveling direction based on an image acquired by an imaging section…” of Abstract, ¶ [0025]-¶ [0028], ¶ [0042]-¶ [0067], and Fig. 2 elements Ob, CS, Fig. 6 steps S11-S19); and generate images for each of the plurality of avoidance behavior types based on the generated image, wherein each of the images for each of the plurality of avoidance behavior types comprises a second predicted trajectory representing a relative behavior of the nearby object to the vehicle (See TAKAKI, e.g., “…detects an object present ahead of a vehicle in a vehicle traveling direction based on an image acquired by an imaging section, the vehicle control apparatus including: a position acquisition section that acquires, based on the image, positions of at least two specific points of the vehicle that are different in a lateral direction with respect to the vehicle traveling direction; a control section that performs collision avoidance control against the object based on a movement track of the object obtained from a history of the positions of the specific points; a calculation section that calculates, for each of the specific points, a movement direction of each of the specific points based on the history of the position of each of the specific points; and a changing section that changes how to perform the collision avoidance control based on a difference between the movement directions at the respective specific points…” of Abstract, ¶ [0025]-¶ [0028], ¶ [0042]-¶ [0067], and Fig. 2 elements Ob, CS, Fig. 6 steps S11-S19), and wherein the second predicted trajectory is determined based on the first predicted trajectory and an avoidance trajectory of the vehicle according to a corresponding avoidance behavior type (See TAKAKI, e.g., “…detects an object present ahead of a vehicle in a vehicle traveling direction based on an image acquired by an imaging section, the vehicle control apparatus including: a position acquisition section that acquires, based on the image, positions of at least two specific points of the vehicle that are different in a lateral direction with respect to the vehicle traveling direction; a control section that performs collision avoidance control against the object based on a movement track of the object obtained from a history of the positions of the specific points; a calculation section that calculates, for each of the specific points, a movement direction of each of the specific points based on the history of the position of each of the specific points; and a changing section that changes how to perform the collision avoidance control based on a difference between the movement directions at the respective specific points…” of Abstract, ¶ [0025]-¶ [0028], ¶ [0042]-¶ [0067], and Fig. 2 elements Ob, CS, Fig. 6 steps S11-S19).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify Shah with the teachings of TAKAKI with a reasonable expectation of success to yield “provide a vehicle control apparatus and a vehicle control method capable of preventing unnecessary operation while performing various types of control for improving traveling safety of a vehicle”, as taught in ¶ [0007].
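For illustration only (grid size and resolution hypothetical), the image-generation step recited in claims 7 and 17 might be sketched as rasterizing each predicted trajectory into a top-down grid, producing one such image per candidate avoidance behavior type:

    def rasterize_trajectory(points, size=64, scale_m_per_px=0.5):
        # Mark each (x, y) trajectory point, given in meters relative to
        # the ego vehicle at the grid center, in a size-by-size grid.
        grid = [[0] * size for _ in range(size)]
        half = size // 2
        for x, y in points:
            col = half + int(round(x / scale_m_per_px))
            row = half - int(round(y / scale_m_per_px))
            if 0 <= row < size and 0 <= col < size:
                grid[row][col] = 1
        return grid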
Allowable Subject Matter
Claims 8-10, 18-20 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. Further, the prior art of record fails to teach or suggest, either singly or in combination, the claimed subject matter of claims 8-10, 18-20.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Havlak et al. (US Pat. No.: 11,970,164 B1) teaches “A vehicle computer system implements techniques to predict behavior of objects detected by a vehicle operating in the environment. The techniques include using a model to determine a first object trajectory for an object (e.g., a predicted object trajectory) and/or a potential object in an occluded area, as well as a second object trajectory for the object or potential object (e.g., an adverse object trajectory). The model is configured to use one or more algorithms, classifiers, and/or computational resources to predict candidate trajectories for the vehicle based on at least one of the first object trajectory or the second object trajectory. Based on the predicted behavior of the object (or potential object) and the predicted candidate trajectories for the vehicle, a vehicle computer system controls operation of the vehicle.”
Park et al. (US Pat. No.: 11,535,252 B2) teaches “The electronic apparatus includes a sensing unit including at least one sensor, a memory storing one or more instructions, and a processor configured to execute the one or more instructions stored in the memory to identify an object located near the vehicle, by using the at least one sensor, generate risk information of the object, the risk information including a type of the identified object, adjust a size of a bounding box generated to include at least a part of the identified object, based on the risk information of the object, and control a driving operation of the vehicle, based on the adjusted bounding box.”
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BABAR SARWAR whose telephone number is (571)270-5584. The examiner can normally be reached on Mon-Fri 9:00 AM-5:00 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Faris S. Almatrahi can be reached on (313)446-4821. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/BABAR SARWAR/Primary Examiner, Art Unit 3667