DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
Claims 1, 8, and 15 have been amended, introducing new limitations. Claims 4, 7, 11, 14, 18, and 20 have been amended to address minor informalities.
No claims have been canceled.
No new claims have been introduced.
Claims 1-20 are currently pending.
This Office action is responsive to the amendment filed after the non-final Office action.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Shoemaker (US 20240286609 A1) in view of Rahimpour (US 20220306088 A1) and further in view of Takahashi (JP 2008282097 A).
REGARDING CLAIM 1, Shoemaker discloses, identifying an animal in an environment of a vehicle (Shoemaker: [0055] If the vehicle determines that a collision is likely, it will take evasive action to avoid the collision. As discussed throughout, the evasive actions may be customized for animals, such that a collision is avoided (examiner: maneuver based upon type implies identification)) based on sensor data obtained by one or more sensors mounted to an exterior of the vehicle (Shoemaker: [0016]; [0041] The artificial intelligence model may be configured to ingest data from at least one sensor of the autonomous vehicle and predict the attributes of the object), the animal proximate to a planned path for the vehicle (Shoemaker: [0055] If the vehicle determines that a collision is likely, it will take evasive action to avoid the collision. As discussed throughout, the evasive actions may be customized for animals, such that a collision is avoided (examiner: maneuver based upon type implies identification)); classifying the animal based on the sensor data (Shoemaker: [0041] The artificial intelligence model may be configured to ingest data from at least one sensor of the autonomous vehicle and predict the attributes of the object; [0055] If the vehicle determines that a collision is likely, it will take evasive action to avoid the collision. As discussed throughout, the evasive actions may be customized for animals, such that a collision is avoided (examiner: maneuver based upon type implies identification)); predicting a future pathway of the animal (Shoemaker: [ABS] an indication that a current trajectory of an autonomous vehicle is associated with a likelihood of a collision that satisfies a collision threshold indicating a potential collision with an animal having an attribute that satisfies a threshold; [0050] As the process is executed, the processor may determine a likelihood of a collision with a target object (animal). 
If the likelihood satisfies a threshold, the processor may take another action, such as steps 420, 430, 440, 450. After each step, the processor may evaluate the likelihood again to determine if the threshold is still satisfied, thereby causing another action to urge the animal to move or take evasive action. Once the likelihood no longer satisfies that threshold for a collision, the autonomous vehicle may proceed along the trajectory (examiner: predicting the animal will not move without being urged is per se predicting a future path); [0055] and predict the trajectory of the objects in order to determine if a collision is imminent (within a predetermined time window). If the vehicle determines that a collision is likely, it will take evasive action to avoid the collision. As discussed throughout, the evasive actions may be customized for animals, such that a collision is avoided); selecting an action for the vehicle to perform based on the classification of the animal (Shoemaker: [0055] If the vehicle determines that a collision is likely, it will take evasive action to avoid the collision. As discussed throughout, the evasive actions may be customized for animals, such that a collision is avoided (examiner: maneuver based upon type implies identification)) and the predicted future pathway (Shoemaker: [ABS] an indication that a current trajectory of an autonomous vehicle is associated with a likelihood of a collision that satisfies a collision threshold indicating a potential collision with an animal having an attribute that satisfies a threshold; [0050] As the process is executed, the processor may determine a likelihood of a collision with a target object (animal). If the likelihood satisfies a threshold, the processor may take another action, such as steps 420, 430, 440, 450. 
After each step, the processor may evaluate the likelihood again to determine if the threshold is still satisfied, thereby causing another action to urge the animal to move or take evasive action. Once the likelihood no longer satisfies that threshold for a collision, the autonomous vehicle may proceed along the trajectory (examiner: predicting the animal will not move without being urged is per se predicting a future path); [0055] and predict the trajectory of the objects in order to determine if a collision is imminent (within a predetermined time window). If the vehicle determines that a collision is likely, it will take evasive action to avoid the collision. As discussed throughout, the evasive actions may be customized for animals, such that a collision is avoided), the action selected to enable the vehicle to traverse the planned path (Shoemaker: [0018] the truck 102 can plan and execute maneuvers and/or routes with respect to the features of the digital map. The behaviors, planning, and control aspects of the autonomy system 150 may be configured to make decisions about how the truck 102 should move through the environment to get to its goal or destination); performing the selected action by the vehicle (Shoemaker: [0005] methods and systems for optimizing path planning and autonomous vehicle operation when facing animals on the road; [0038] configured to use one or more system inputs to identify, evaluate, and modify a vehicle trajectory); obtaining additional sensor data (Shoemaker: [0016] various sensors (e.g., LiDAR, camera, radar, etc.) of the autonomy system 150 may identify one or more objects (e.g., pedestrians, vehicles, debris, etc.) 
and features of the roadway; [0030] the various sensors, such as camera system 220, LiDAR system 222, GNSS receiver 208, and/or IMU 224 (collectively “perception data”) to sense an environment surrounding the truck 260 and interpret it), the sensor data including data describing the classified animal (Shoemaker: [0054] different sensors, such as cameras, Lidar, and radar, to detect and identify the target objects resemble an animal … mass … speed … Using the data gathered, the processor may determine that the target object is an animal); and determining, based on the additional sensor data, that the vehicle can traverse the planned path (Shoemaker: [0037] determined its location with respect to map features (e.g., intersections, road signs, lane lines, etc.) the truck 260 may use the vehicle control module 206 and its associated systems to plan and execute maneuvers and/or routes with respect to the features of the environment. The vehicle control module 206 may make decisions about how the truck 260 will move through the environment to get to its goal or destination as it completes its mission).
Shoemaker discloses classifying the animal based on the sensor data. Shoemaker does not explicitly disclose, classifying the animal based on the sensor data and additional data describing the environment of the vehicle.
However, in the same field of endeavor, Rahimpour discloses, and additional data describing the environment of the vehicle (Rahimpour: [ABS] identification of an object based on the ambient air temperature; [0031] identifying the current location of the vehicle as one of an urban environment or a rural environment, outputting the identification of the object as a pedestrian when the current location is the urban environment, and outputting the identification of the object as an animal when the current location is a rural environment; [0055] ... trained to identify specific objects in an urban environment ... as a pedestrian or an animal that typically lives in an urban environment (e.g., a pigeon, a dog, etc.) ... an animal that typically lives in a rural environment (e.g., a deer, a fox, etc.) when the current location is the rural environment), for the benefit of identifying an object and the risk of collision with the object.
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method disclosed by Shoemaker to include the environmental cues taught by Rahimpour. One of ordinary skill in the art would have been motivated to make this modification, with a reasonable expectation of success, in order to identify an object and the risk of collision with the object.
Shoemaker, as modified, does not explicitly disclose, predicting a future pathway of the animal based on a likelihood the animal is to cross a physical barrier detected between the animal and the planned path.
However, in the same field of endeavor, Takahashi discloses, predicting a future pathway of the animal based on a likelihood the animal is to cross a physical barrier detected between the animal and the planned path (Takahashi: [0071] the possibility of the detected pedestrian entering the vehicle path and colliding with the vehicle is predicted from the crossing intention estimated in step 106 and the position of the barrier detected in step 108. Then, the collision risk with the own vehicle is calculated; [0076] If there is a barrier in the direction in which the pedestrian translates on the road, it is assumed that the pedestrian protrudes along the barrier edge. Based on the above assumption, the collision probability that a physical collision occurs is calculated. At this time, the collision probability between the pedestrian jumping ahead of the vehicle and the vehicle depends on the relative position and relative speed between the vehicle and the pedestrian), for the benefit of estimating a collision risk with high reliability and supporting an operation to avoid a collision.
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method disclosed by the modified Shoemaker to include determining a collision risk level in the presence of a barrier, as taught by Takahashi. One of ordinary skill in the art would have been motivated to make this modification, with a reasonable expectation of success, in order to estimate a collision risk with high reliability and support an operation to avoid a collision.
REGARDING CLAIM 2, Shoemaker, as modified, remains as applied above to claim 1, and further, Shoemaker also discloses, the additional data describing the environment of the vehicle comprises at least one of a current time of day and a location of the vehicle (Shoemaker: [0026] the IMU 224 may be communicatively coupled to the GNSS receiver 208 and/or the mapping/localization module 204, to help determine a real-time location of the truck 200, and predict a location of the truck; [0072] disables headlights (if it is night time) (examiner: implies time awareness, time stamping event)).
REGARDING CLAIM 3, Shoemaker, as modified, remains as applied above to claim 2, and further, Rahimpour also discloses, retrieving a list of potential animals in the environment of the vehicle (Rahimpour: [0055] The machine learning program can output the identification of the object 200 as a pedestrian or an animal that typically lives in an urban environment (e.g., a pigeon, a dog, etc.) when the current location is the urban environment and as an animal that typically lives in a rural environment (e.g., a deer, a fox, etc.) when the current location is the rural environment) based on the additional data (Rahimpour: [0055] based on the current location); and classifying the animal based on the list of potential animals (Rahimpour: [0055] For each identification, the urban environment can have a predetermined threshold and the rural environment can have a respective predetermined threshold. The thresholds can be different, e.g., the threshold for identifying a deer in the urban environment can be greater than the threshold for identifying the deer in the rural environment because deer may be more commonly found in rural environments than urban environments, and the higher threshold for the urban environment can reduce a likelihood of a false positive identification of the deer in the urban environment).
REGARDING CLAIM 4, Shoemaker, as modified, remains as applied above to claim 1, and further, Shoemaker also discloses, inputting at least a portion of the sensor data (Shoemaker: [0041] the artificial intelligence model is a predictive machine learning model that may be continuously trained using updated data, e.g., relative velocity data, mass attribute data, and target objects classification data) and at least a portion of the additional data (Shoemaker: [0017] The maps/localization aspect of the autonomy system 150 may be configured to determine where on a pre-established digital map the truck 102 is currently located. One way to do this is to sense the environment surrounding the truck 102 (e.g., via the perception system) and to correlate features of the sensed environment with details (e.g., digital representations of the features of the sensed environment) on the digital map (examiner: localization is updated data, see [0041])) into a machine-trained classification model (Shoemaker: [0041] the artificial intelligence model is a predictive machine learning model that may be continuously trained using updated data, e.g., relative velocity data, mass attribute data, and target objects classification data); and receiving data indicating a type of the animal from the machine-trained classification model (Shoemaker: [0016] a perception module or engine in the autonomy system 150 of the truck 102 may identify and classify objects or groups of objects in the environment; [0032] system may identify and classify various features detected in the collected perception data from the environment with the features stored in a digital map; [0041] the artificial intelligence model is a predictive machine learning model that may be continuously trained using updated data, e.g., relative velocity data, mass attribute data, and target objects classification data).
REGARDING CLAIM 5, Shoemaker, as modified, remains as applied above to claim 1, and further, Shoemaker also discloses, the action is selected from a set of vehicle actions comprising moving the vehicle towards the animal, moving the vehicle away from the animal, and moving the vehicle in a path around the animal (Shoemaker: [0050] If the likelihood satisfies a threshold, the processor may take another action, such as steps 420, 430, 440, 450. After each step, the processor may evaluate the likelihood again to determine if the threshold is still satisfied, thereby causing another action to urge the animal to move or take evasive action. Once the likelihood no longer satisfies that threshold for a collision, the autonomous vehicle may proceed along the trajectory; [0055] The processor may use various methods discussed herein (e.g., computer vision and/or machine learning algorithms) to analyze the data from the sensors and predict the trajectory of the objects in order to determine if a collision is imminent (within a predetermined time window). If the vehicle determines that a collision is likely, it will take evasive action to avoid the collision. As discussed throughout, the evasive actions may be customized for animals, such that a collision is avoided).
REGARDING CLAIM 6, Shoemaker, as modified, remains as applied above to claim 1, and further, Shoemaker also discloses, the action comprises making a sound at a specific frequency, the specific frequency selected based on the classification of the animal (Shoemaker: [0060] the processor may instruct a horn to output a loud noise for a duration of time, such as for a second, two seconds, or five seconds. The sound outputted by the autonomous vehicle may be a predetermined sound (e.g., whistle) that corresponds to the animal detected; [0055] discussed throughout, the evasive actions may be customized for animals; [0072]).
REGARDING CLAIM 7, Shoemaker, as modified, remains as applied above to claim 1, and further, Shoemaker also discloses, transmitting data describing the animal (Shoemaker: [0034] perform environmental mapping and/or track object vectors (e.g., speed and direction) ... objects or features may be classified into various object classes using the image classification function, for instance, and the computer vision function may track the one or more classified objects to determine aspects of the classified object (e.g., aspects of its motion, size, etc.); [0035] A centralized mapping system may be accessible via network 260 for updating the digital map(s) ... the truck 200 and other vehicles (e.g., a fleet of trucks similar to the truck 200) can generate, maintain (e.g., update), and use their own generated maps when conducting a mission; [0054]) and a location of the vehicle or the animal to a fleet management system (Shoemaker: see [0062] animal existing on current route), wherein the fleet management system instructs a second vehicle in a fleet to select a route that avoids the animal (Shoemaker: [0035] discloses updating a central map for a fleet; [0043] discloses a cost map associated with the central map; and [0062] discloses "The processor may identify a plurality of alternative trajectories for the autonomous vehicle. Each alternative trajectory may change the direction of the trajectory for the autonomous vehicle. Non-limiting examples of the alternative trajectory may include changing lanes to avoid the animal or exiting the current route (e.g., existing a highway to avoid a potential collision with the animal)").
REGARDING CLAIM 8, Shoemaker discloses, a sensor suite comprising a plurality of sensors mounted to an exterior of the vehicle (Shoemaker: [0016]) and to obtain sensor data describing an environment of the vehicle (Shoemaker: [0016]); and processing circuitry to: identify an animal in the environment of the vehicle based on the sensor data (Shoemaker: [0055]), the animal proximate to a planned path for the vehicle (Shoemaker: [0055]); classify the animal based on the sensor data (Shoemaker: [0041]; [0055]); predict a future pathway of the animal (Shoemaker: [ABS]; [0050]; [0055]); select an action for the vehicle to perform based on the classification of the animal (Shoemaker: [0055]) and the predicted future pathway (Shoemaker: [ABS]; [0050]; [0055]), the action selected to enable the vehicle to traverse the planned path (Shoemaker: [0018]); instruct the vehicle to perform the selected action (Shoemaker: [0005]; [0038]); receive additional sensor data from the sensor suite (Shoemaker: [0016]; [0030]), the sensor data including data describing the classified animal (Shoemaker: [0054]); and determine, based on the additional sensor data, that the vehicle can traverse the planned path (Shoemaker: [0037]).
Shoemaker discloses classifying the animal based on the sensor data. Shoemaker does not explicitly disclose, classifying the animal based on the sensor data and additional data describing the environment of the vehicle.
However, in the same field of endeavor, Rahimpour discloses, and additional data describing the environment of the vehicle (Rahimpour: [ABS]; [0031]; [0055]), for the benefit of identifying an object and the risk of collision with the object.
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method disclosed by Shoemaker to include the environmental cues taught by Rahimpour. One of ordinary skill in the art would have been motivated to make this modification, with a reasonable expectation of success, in order to identify an object and the risk of collision with the object.
Shoemaker, as modified, does not explicitly disclose, predicting a future pathway of the animal based on a likelihood the animal is to cross a physical barrier detected between the animal and the planned path.
However, in the same field of endeavor, Takahashi discloses, predicting a future pathway of the animal based on a likelihood the animal is to cross a physical barrier detected between the animal and the planned path (Takahashi: [0071]; [0076]), for the benefit of estimating a collision risk with high reliability and supporting an operation to avoid a collision.
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method disclosed by the modified Shoemaker to include determining a collision risk level in the presence of a barrier, as taught by Takahashi. One of ordinary skill in the art would have been motivated to make this modification, with a reasonable expectation of success, in order to estimate a collision risk with high reliability and support an operation to avoid a collision.
REGARDING CLAIM 9, Shoemaker, as modified, remains as applied above to claim 8, and further, Shoemaker also discloses, the additional data describing the environment of the vehicle comprises at least one of a current time of day and a location of the vehicle (Shoemaker: [0026]; [0072]).
REGARDING CLAIM 10, Shoemaker, as modified, remains as applied above to claim 9, and further, Rahimpour also discloses, retrieving a list of potential animals in the environment of the vehicle (Rahimpour: [0055]) based on the additional data (Rahimpour: [0055]); and classifying the animal based on the list of potential animals (Rahimpour: [0055]).
REGARDING CLAIM 11, Shoemaker, as modified, remains as applied above to claim 8, and further, Shoemaker also discloses, inputting at least a portion of the sensor data (Shoemaker: [0041]) and at least a portion of the additional data (Shoemaker: [0017] (examiner: localization is updated data, see [0041])) into a machine-trained classification model (Shoemaker: [0041]); and receiving data indicating a type of the animal from the machine-trained classification model (Shoemaker: [0016]; [0032]; [0041]).
REGARDING CLAIM 12, Shoemaker, as modified, remains as applied above to claim 8, and further, Shoemaker also discloses, the action is selected from a set of vehicle actions comprising moving the vehicle towards the animal, moving the vehicle away from the animal, and moving the vehicle in a path around the animal (Shoemaker: [0050]; [0055]).
REGARDING CLAIM 13, Shoemaker, as modified, remains as applied above to claim 8, and further, Shoemaker also discloses, outputting a sound from the audio device at a specific frequency, the specific frequency selected based on the classification of the animal (Shoemaker: [0060]; [0055]; [0072]).
REGARDING CLAIM 14, Shoemaker, as modified, remains as applied above to claim 8, and further, Shoemaker also discloses, transmitting data describing the animal (Shoemaker: [0034]; [0035]; [0054]) and a location of the vehicle or the animal to a fleet management system (Shoemaker: see [0062] animal existing on current route), wherein the fleet management system instructs a second vehicle in a fleet to select a route that avoids the animal (Shoemaker: [0035]; [0043]; and [0062]).
REGARDING CLAIM 15, Shoemaker discloses, identify an animal in an environment of a vehicle (Shoemaker: [0055]) based on sensor data obtained by one or more sensors mounted to an exterior of the vehicle (Shoemaker: [0016]; [0041]), the animal proximate to a planned path for the vehicle (Shoemaker: [0055]); classify the animal based on the sensor data (Shoemaker: [0041]; [0055]); predict a future pathway of the animal (Shoemaker: [ABS]; [0050]; [0055]); select an action for the vehicle to perform based on the classification of the animal and the predicted future pathway (Shoemaker: [ABS]; [0050]; [0055]), the action selected to enable the vehicle to traverse the planned path (Shoemaker: [0018]); perform the selected action by the vehicle (Shoemaker: [0005]; [0038]); obtain additional sensor data (Shoemaker: [0016]; [0030]), the sensor data including data describing the classified animal (Shoemaker: [0054]); and determine, based on the additional sensor data, that the vehicle can traverse the planned path (Shoemaker: [0037]).
Shoemaker discloses classifying the animal based on the sensor data. Shoemaker does not explicitly disclose, classifying the animal based on the sensor data and additional data describing the environment of the vehicle.
However, in the same field of endeavor, Rahimpour discloses, and additional data describing the environment of the vehicle (Rahimpour: [ABS]; [0031]; [0055]), for the benefit of identifying an object and the risk of collision with the object.
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method disclosed by Shoemaker to include the environmental cues taught by Rahimpour. One of ordinary skill in the art would have been motivated to make this modification, with a reasonable expectation of success, in order to identify an object and the risk of collision with the object.
Shoemaker, as modified, does not explicitly disclose, predicting a future pathway of the animal based on a likelihood the animal is to cross a physical barrier detected between the animal and the planned path.
However, in the same field of endeavor, Takahashi discloses, predicting a future pathway of the animal based on a likelihood the animal is to cross a physical barrier detected between the animal and the planned path (Takahashi: [0071]; [0076]), for the benefit of estimating a collision risk with high reliability and supporting an operation to avoid a collision.
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method disclosed by the modified Shoemaker to include determining a collision risk level in the presence of a barrier, as taught by Takahashi. One of ordinary skill in the art would have been motivated to make this modification, with a reasonable expectation of success, in order to estimate a collision risk with high reliability and support an operation to avoid a collision.
REGARDING CLAIM 16, Shoemaker, as modified, remains as applied above to claim 15, and further, Shoemaker also discloses, the additional data describing the environment of the vehicle comprises at least one of a current time of day and a location of the vehicle (Shoemaker: [0026]; [0072]).
REGARDING CLAIM 17, Shoemaker, as modified, remains as applied above to claim 16, and further, Rahimpour also discloses, retrieving a list of potential animals in the environment of the vehicle (Rahimpour: [0055]) based on the additional data (Rahimpour: [0055]); and classifying the animal based on the list of potential animals (Rahimpour: [0055]).
REGARDING CLAIM 18, Shoemaker, as modified, remains as applied above to claim 15, and further, Shoemaker also discloses, inputting at least a portion of the sensor data (Shoemaker: [0041]) and at least a portion of the additional data (Shoemaker: [0017]) into a machine-trained classification model (Shoemaker: [0041]); and receiving data indicating a type of the animal from the machine-trained classification model (Shoemaker: [0016]; [0032]; [0041]).
REGARDING CLAIM 19, Shoemaker, as modified, remains as applied above to claim 15, and further, Shoemaker also discloses, the action is selected from a set of vehicle actions comprising moving the vehicle towards the animal, moving the vehicle away from the animal, and moving the vehicle in a path around the animal (Shoemaker: [0050]; [0055]).
REGARDING CLAIM 20, Shoemaker, as modified, remains as applied above to claim 15, and further, Shoemaker also discloses, transmit data describing the animal (Shoemaker: [0034]; [0035]; [0054]) and a location of the vehicle or the animal to a fleet management system (Shoemaker: see [0062] animal existing on current route), wherein the fleet management system instructs a second vehicle in a fleet to select a route that avoids the animal (Shoemaker: [0035]; [0043]; and [0062]).
Response to Arguments
Applicant’s arguments with respect to the rejection of the independent claims under 35 U.S.C. § 103, beginning on page 7 of the response submitted 08-19-2025, have been considered but are moot because the new ground of rejection does not rely on the reference combination applied in the prior rejection of record for any matter specifically challenged in the argument.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
STUDER (DE 102022002339 A1) - Method For Configuring And Operating A Wildlife Crossing Warning System Of A Vehicle
GIBANICA (EP 4407583 A1) - SAFETY SYSTEM FOR A VEHICLE FOR PROTECTING A VEHICLE OCCUPANT AND WILDLIFE, TRAINING MODULE, VEHICLE COMPRISING A SAFETY SYSTEM, USE OF SAFETY SYSTEM AND COMPUTER-IMPLEMENTED METHOD USING A SAFETY SYSTEM IN A VEHICLE
Pohl (US 20190225214 A1) - ADVANCED WILD-LIFE COLLISION AVOIDANCE FOR VEHICLES
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to AARRON SANTOS whose telephone number is (571)272-5288. The examiner can normally be reached Monday - Friday: 8:00am - 4:30pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, ANGELA ORTIZ can be reached at (571) 272-1206. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/A.S./Examiner, Art Unit 3663
/ANGELA Y ORTIZ/Supervisory Patent Examiner, Art Unit 3663