DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 1-20 have been examined.
P = paragraph, e.g. p5 = paragraph 5; ab = abstract.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Nagarajan et al. (U.S. Patent Application Publication 2020/0356091) in view of Parekh et al. (U.S. Patent Application Publication 2020/0201325).
As per claims 1, 9 and 19, Nagarajan discloses an autonomous vehicle comprising: a first sensor group configured to generate first sensor data;
a second sensor group configured to generate second sensor data;
processing logic configured to generate an operation instruction (p’s 45, 23, 56, 77, 36; figure 2); and a switch coupled between the processing logic and the first sensor group and the second sensor group, wherein the switch includes a prioritization engine configured to prioritize an order of transmission, from the switch to the processing logic, of the first sensor data over the second sensor data based on vehicle operation data received by the prioritization engine (p’s 48, 45, 63, 66-67; figure 3; p’s 77, 36; ab).
Nagarajan discloses via p45:
[0045] For example, when the AV 102 is travelling on the highway of the first route at the speed of 80 kmph on a sunny day, for maintaining the safety KPIs of the AV 102, the LiDAR sensor 204 may not be required to detect objects that are at a distance greater than 500 meters from the AV 102. Detecting the objects that are within 500 meters from the AV 102 corresponds to the optimal configuration for the LiDAR sensor 204 at that given point of time. If the LiDAR sensor 204 is operated at any other configuration under the given factors (i.e., the highway, the speed of 80 kmph, and the sunny day), either the safety KPIs are compromised or the power consumed by the LiDAR sensor 204 is increased relative to the optimal configuration. Thus, the AV configuration device 104 configures the LiDAR sensor 204 to be operational and detect the objects that are within 500 meters from the AV 102. Since the weather type is sunny, the visibility is high. In such a scenario, the AV configuration device 104 may select to switch-off the IR sensor 208 as the information provided by the IR sensor 208 is more useful when the visibility is low and switch-on the image acquiring module 210 to detect the presence of objects in the vicinity of the AV 102. The image acquiring module 210 may successfully capture images and detect objects due to high visibility. The AV configuration device 104 may be further configured to switch-off fog lights 216a and 216b (hereinafter, the fog lights 216a and 216b are collectively referred to and designated as “fog lights 216”) of the AV 102. Switching-off the IR sensor 208 and the fog lights 216, and turning on the image acquiring module 210, in this scenario, corresponds to the optimal configuration. Likewise, the AV configuration device 104 may select the optimal configuration for each component of the AV 102.
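As a minimal illustrative sketch of the selection logic described in p45 (the function name, setting names, and dictionary keys are hypothetical, not code from the reference), the mapping of operating factors to an optimal sensor configuration might look like:

```python
def optimal_config(road_type: str, weather: str) -> dict:
    """Sketch of the p45 example: on a sunny highway run, limit the
    LiDAR to 500 m, switch the IR sensor and fog lights off, and
    switch the image acquiring module (camera) on."""
    config = {}
    if weather == "sunny":          # high visibility
        config["ir_sensor"] = "off"
        config["camera"] = "on"
        config["fog_lights"] = "off"
        # 500 m suffices on the highway per the stated safety KPIs;
        # 200 m is the urban-route figure given in p23
        config["lidar_range_m"] = 500 if road_type == "highway" else 200
    return config

cfg = optimal_config("highway", "sunny")
# cfg["lidar_range_m"] == 500 and cfg["ir_sensor"] == "off"
```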
Nagarajan discloses all the limitations of the invention. However, arguendo, if Nagarajan is or might be interpreted such that it does not explicitly disclose prioritizing an order of transmission of the first sensor data over the second sensor data, then Parekh discloses prioritizing an order of transmission of the first sensor data over the second sensor data (p’s 21, 25, 4-5, 47-52, 57-58, 60-61; fig’s 6, 5, 4, 3). Under that interpretation, it would have been obvious, before the effective filing date of the claimed invention, to modify Nagarajan to include prioritizing an order of transmission of the first sensor data over the second sensor data, as taught by Parekh, in order to (i) analyze the one or more data streams, (ii) determine a current operation status of a vehicle based on the one or more data streams, (iii) determine whether or not the current operation status of the vehicle triggers prioritized processing of the one or more data streams, and (iv) upon determination that the prioritized processing is triggered, apply the prioritized processing (Parekh, p4).
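To make the combined teaching concrete, a minimal sketch of a prioritization engine in the claimed sense (hypothetical names and a hypothetical gear-based rule; not code from either reference) might be:

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    group: str   # hypothetical group label, e.g. "front" or "rear"
    data: bytes

def prioritize(frames, vehicle_operation_data):
    """Order frames for transmission to the processing logic so that
    the sensor group matching the current travel direction goes first."""
    # Hypothetical rule: in reverse gear, rear-group frames come first.
    preferred = "rear" if vehicle_operation_data.get("gear") == "reverse" else "front"
    # sorted() is stable: within each group the original order is kept
    return sorted(frames, key=lambda f: f.group != preferred)

frames = [SensorFrame("front", b"f0"), SensorFrame("rear", b"r0")]
ordered = prioritize(frames, {"gear": "reverse"})
# the rear-group frame is now first in the transmission order
```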
As per claims 2, 3 and 10, Nagarajan discloses wherein the processing logic is configured to generate an operation instruction for operating the autonomous vehicle based at least in part on the first sensor data that was prioritized over the second sensor data (p’s 45, 23, 48, 56; figure 2; p’s 63, 66-67, 77, 36; figure 3), as per the discussion above; the rejection of the corresponding parts of the claims above is incorporated herein. Further, Nagarajan discloses via p48:
[0048] In another example, the AV 102 may be travelling on the urban road of the first route on a foggy day. In this scenario, the image acquiring module 210 may not be able to capture clear images and detect objects due to poor visibility. Thus, under the given factors (i.e., the urban road and the foggy day), the optimal configuration selected by the AV configuration device 104 would be to switch-off the image acquiring module 210 and switch-on the IR sensor 208 and the fog lights 216. Further, during the foggy weather, the information provided by the LiDAR sensor 204 may not be useful, thus the optimal configuration selected by the AV configuration device 104 further includes switching-off the LiDAR sensor 204. Based on the selected optimal configuration, the AV configuration device 104 may switch-on the IR sensor 208 and the fog lights 216, and switch-off the image acquiring module 210 and the LiDAR sensor 204.
As per claim 4, Nagarajan discloses a control system configured to control the autonomous vehicle based on the operation instruction generated by the processing logic (p’s 48, 56; figure 2; p’s 63, 66-67, 77, 36; figure 3; p’s 45, 23), as per the discussion above.
As per claims 5 and 11, Nagarajan discloses wherein the first sensor group and the second sensor group are configured to detect one or more objects in an external environment of the autonomous vehicle (figure 2; p’s 63, 66-67, 77, 36; figure 3; p’s 48, 56, 45, 23), as per the discussion above; the rejection of the corresponding parts of the claims above is incorporated herein. Further, Nagarajan discloses via p23:
[0023] The AV configuration device 104 may be configured to process and analyze the acquired information. Based on the analysis of the acquired information, the AV configuration device 104 may be configured to select a first configuration for operating the components of the AV 102. The AV configuration device 104 may utilize the acquired information to select the first configuration (hereinafter, the first configuration is referred to as “an optimal configuration”) from a set of configurations associated with the components of the AV 102. The set of configurations associated with the components may refer to various functional settings of the components. In one example, the set of configurations for the IR sensor of the AV 102 may include configuring the IR sensor to be operational or in-operational. In another example, the set of configurations for the LiDAR sensor of the AV 102 may include configuring the LiDAR sensor to be operational or in-operational and detecting objects within various distance ranges, for example, 500 meters, 200 meters, or the like. The optimal configuration for a component of the AV 102, under the given factors, may correspond to that functional setting of the component at which the component consumes minimum power while meeting safety key performance indicators (KPIs) of the AV 102. Examples of the safety KPIs may include, but are not limited to, a minimum accepted gap at intersections or in lane changes, maximum jerk intensity, maximum longitudinal acceleration and deceleration, or detecting objects of specific sizes and at specific distances. For example, when the AV 102 is travelling on a highway at a speed of 50 kilometers per hour (kmph) on a sunny day, for maintaining the safety KPIs of the AV 102, the LiDAR sensor of the AV 102 may not be required to detect objects that are at a distance greater than 500 meters from the AV 102. 
Detecting the objects that are within 500 meters from the AV 102 corresponds to the optimal configuration at that given point of time. If the LiDAR sensor is operated at any other configuration under the given factors (i.e., the highway, the speed of 80 kmph, and the sunny day), either the safety KPIs are compromised or the power consumed by the LiDAR sensor is increased relative to the optimal configuration. In another example, the AV 102 may be travelling on an urban route at a speed of 50 kmph on a sunny day. In this scenario, the LiDAR sensor may not be required to detect objects that are at a distance greater than 200 meters from the AV 102. Thus, the optimal configuration for the LiDAR sensor under the given factors (i.e., the urban route, the speed of 50 kmph, and the sunny day) is to detect objects that are within 200 meters from the AV 102. Thus, the optimal configuration for a component of the AV 102 is a function of the factors associated with the AV 102. The AV configuration device 104 may configure the components of the AV 102 to operate at the selected optimal configuration. As the AV configuration device 104 ensures that the components of the AV 102 are not always operated at the corresponding highest configuration, a durability of the components is increased.
As per claim 6, Nagarajan discloses wherein the vehicle operation data is representative of a state of the autonomous vehicle (p’s 63, 66-67, 77, 36; figure 3; p’s 48, 56, 45, 23; figure 2), as per the discussion above; the rejection of the corresponding parts of the claims above is incorporated herein. Further, Nagarajan discloses via p63:
[0063] In one embodiment, instead of the AV configuration device 104, the application server 108 may acquire the information pertaining to the factors associated with the AV 102. In operation, the processor 402 may be configured to acquire the information pertaining to the factors associated with the AV 102 by way of the transceiver 406. The processor 402 may acquire the information from the AV 102, the first and second vehicles 106a and 106b, the database server 110, and the third-party server. The acquired information may be then stored in the memory 404 for further processing. Based on the acquired information, the processor 402 may be configured to select the optimal configuration for operating the components. For example, the processor 402 may select to keep only those components operational which are sufficient to run the AV 102 under the given factors and may select to switch-off the components that are not required to be operational for meeting the safety KPIs. The transceiver 406 may be configured to transmit the optimal configuration of the components to the AV configuration device 104. When the AV configuration device 104 receives the optimal configuration, the AV configuration device 104 may configure the components to operate at the optimal configuration. When the components of the AV 102 operate at the optimal configuration, the power consumed by the components may be minimum while meeting the safety KPIs. As a result, the durability of the components may be increased.
As per claims 7 and 13, Nagarajan discloses wherein the vehicle operation data includes a transmission-gear of the autonomous vehicle (figure 3; p’s 48, 56, 45, 23; figure 2; p’s 63, 66-67, 77, 36) as per the discussion above.
As per claims 8 and 14, Nagarajan discloses wherein the vehicle operation data includes a speed of the autonomous vehicle (p’s 48, 56, 45, 23; figure 2; p’s 63, 66-67, 77, 36; figure 3), as per the discussion above; the rejection of the corresponding parts of the claims above is incorporated herein. Further, Nagarajan discloses via p66:
[0066] Based on the acquired information, the AV configuration device 104 may select the optimal configuration for the components of the AV 102. The components may include the location sensor 202, the LiDAR sensor 204, the RADAR sensor 206, the IR sensor 208, the image acquiring module 210, the ultrasonic sensor 212, the fog lights 216 or the like. The AV configuration device 104 may then configure the components of the AV 102 at the selected optimal configuration. For example, once the navigation details of the first route are obtained, the AV configuration device 104 may switch-off the location sensor 202 to reduce the power consumed by the location sensor 202. Further, as the AV 102 is travelling on the first road R1 at the time when the visibility is high, the AV configuration device 104 may configure the LiDAR sensor 204 to detect objects within 500 meters and may not turn on the IR sensor 208, the head lights 218, and the tail lights 220. Further, the AV configuration device 104 may configure the speed controller 214 to limit a speed of the AV 102 to 80 kmph (i.e., the speed constraint of the first road R1). The AV configuration device 104 may continue to acquire the information pertaining to the factors associated with the AV 102 and if any of the factors change, the AV configuration device 104 may select a new optimal configuration for operating the components of the AV 102.
As per claim 12, Nagarajan discloses wherein the vehicle operation data is received from a vehicle data bus of the autonomous vehicle (p54; figure 2; p’s 63, 66-67, 77, 36; figure 3; p’s 48, 56, 45, 23), as per the discussion above; the rejection of the corresponding parts of the claims above is incorporated herein. Further, Nagarajan discloses via p54:
[0054] FIG. 3 is a block diagram that illustrates the AV 102, in accordance with an embodiment of the disclosure. The AV configuration device 104 may communicate with the components (e.g., the location sensor 202, the LiDAR sensor 204, the RADAR sensor 206, the IR sensor 208, the image acquiring module 210, the ultrasonic sensor 212, the speed controller 214, the fog lights 216, the head lights 218, the tail lights 220, the entertainment system 222, and the air conditioner system 224) of the AV 102 by way of a first communication bus 300. The AV configuration device 104 may include a processor 302, a configuration controller 304, a memory 306, and a transceiver 308. The processor 302, the configuration controller 304, the memory 306, and the transceiver 308 may communicate with each other by way of a second communication bus 310.
As per claim 15, Nagarajan discloses wherein the vehicle operation data indicates a rear-ward direction of the autonomous vehicle (p’s 37, 63, 66-67, 77, 36; figure 3; p’s 48, 56, 45, 23; figure 2), as per the discussion above; the rejection of the corresponding parts of the claims above is incorporated herein. Further, Nagarajan discloses via figure 2:
[FIG. 2 of Nagarajan (greyscale image, media_image1.png) reproduced here]
As per claim 16, Nagarajan discloses wherein the first sensor data is generated by rear sensors of the autonomous vehicle disposed to detect or image a rear-ward area of the autonomous vehicle (p37; figure 3; p’s 48, 56, 45, 23; figure 2; p’s 63, 66-67, 77, 36), as per the discussion above; the rejection of the corresponding parts of the claims above is incorporated herein. Further, Nagarajan discloses via p37:
[0037] The ultrasonic sensor 212 may include suitable logic, circuitry, interfaces, and/or code, executable by the circuitry, that may be configured to determine a distance of an object from the AV 102. For determining the distance of the object from the AV 102, the ultrasonic sensor 212 may be configured to emit ultrasonic sound waves in the direction of the object. The object may reflect the ultrasonic sound waves emitted by the ultrasonic sensor 212. The ultrasonic sensor 212 may be configured to determine the distance of the object from the AV 102 based on the reflected ultrasonic sound waves. In one example, as shown in FIG. 2, the ultrasonic sensor 212 may be positioned at the backside of the AV 102. In such a scenario, the ultrasonic sensor 212 detects the distance of the object that is located behind the AV 102. It will be apparent to a person of skill in the art that the ultrasonic sensor 212 may be positioned at a different place in the AV 102 and that the AV 102 may include more than one ultrasonic sensor without deviating from the scope of the disclosure.
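For illustration, the time-of-flight distance determination described in p37 (the constant value and function name are assumptions, not from the reference) reduces to halving the round-trip echo path:

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # in air at roughly 20 °C (assumed)

def ultrasonic_distance_m(echo_time_s: float) -> float:
    """Distance to the reflecting object: the emitted ultrasonic pulse
    travels out and back, so the one-way distance is half the
    round-trip path length."""
    return SPEED_OF_SOUND_M_PER_S * echo_time_s / 2.0

# a 10 ms round-trip echo corresponds to about 1.7 m
```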
As per claim 17, Nagarajan discloses wherein the vehicle operation data indicates a forward direction of the autonomous vehicle (p’s 35, 48, 56, 45, 23; figure 2; p’s 63, 66-67, 77, 36; figure 3), as per the discussion above; the rejection of the corresponding parts of the claims above is incorporated herein. Further, Nagarajan discloses via p35:
[0035] The IR sensor 208 may include suitable logic, circuitry, interfaces, and/or code, executable by the circuitry, that may be configured to detect objects during low visibility. For detecting the objects during low visibility, the IR sensor 208 may be configured to measure heat (i.e., IR radiations) emitted by the objects. In one example, as shown in FIG. 2, the IR sensor 208 may be positioned in front of the AV 102 facing a direction of travel of the AV 102. In such a scenario, the IR sensor 208 detects the objects that are located in front of the AV 102. It will be apparent to a person of skill in the art that the IR sensor 208 may be positioned at a different place in the AV 102 and that the AV 102 may include more than one IR sensor without deviating from the scope of the disclosure.
As per claim 18, Nagarajan discloses wherein the first sensor data is generated by front sensors of the autonomous vehicle disposed to detect or image a frontside area of the autonomous vehicle (figure 2; p’s 63, 66-67, 77, 36; figure 3; p’s 48, 56, 45, 23) as per the discussion above.
As per claim 20, Nagarajan discloses wherein the operation instruction generated by the processing logic is received by the control system via the switch (p’s 63, 66-67, 77, 36; figure 3; p’s 48, 56, 45, 23; figure 2), as per the discussion above; the rejection of the corresponding parts of the claims above is incorporated herein. Further, Nagarajan discloses via p56:
[0056] The configuration controller 304 may include suitable logic, circuitry, interfaces, and/or code, executable by the circuitry, that may be configured to operate the components of the AV 102 at the optimal configuration selected by the processor 302. The configuration controller 304 may configure the components to operate at the optimal configuration in real time. For example, when the processor 302 determines that the LiDAR sensor 204 is not required to be operational, the configuration controller 304 switches-off the LiDAR sensor 204 by way of the second communication bus 310. Likewise, the configuration controller 304 may configure the other components of the AV 102 based on the optimal configuration selected by the processor 302. When the components of the AV 102 operate at the optimal configuration, the power consumed by the components under the given factors may be minimum while meeting the safety KPIs. As a result, the durability (i.e., the lifespan) of the components may also be increased. A processing power required by the processor 302 may also be reduced when one or more of the components are switched-off or operated at a lower configuration.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Umezawa et al. (U.S. Patent Application Publication 2013/0088578) discloses an image processing apparatus that includes stereo cameras and detects a distance to a target around a vehicle. The apparatus includes sensors installed in the vehicle; determines a vehicle situation by combining detection signals of the sensors and switches from image data of a stereo camera for driving assistance to image data of a stereo camera for door opening/closing assistance; carries out distortion corrections unique to the respective stereo cameras on the selected image data so that correlation calculation common to the plural stereo cameras can be carried out; carries out the correlation calculation on two sets of image data of the single stereo camera; and detects the distance to the target using a result of the correlation calculation.
Kroop et al. (U.S. Patent Application Publication 2018/0257643) discloses a self-driving vehicle (SDV) that operates by analyzing a live sensor view to autonomously control the acceleration, braking, and steering systems of the SDV along a current route. Based on a detected anomaly associated with the live sensor view, the SDV can transmit a teleassistance inquiry to a backend transport system in accordance with a data prioritization scheme. The SDV can receive a resolution response from the backend transport system based on the teleassistance inquiry and proceed in accordance with the resolution response.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BEHRANG BADII whose telephone number is 571-272-6879. The examiner can normally be reached on Monday-Friday.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Hunter Lonsberry, can be reached at 571-272-7298. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from Patent Center; status information for unpublished applications is available through Patent Center for authorized users only. Should you have questions about access to Patent Center, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
Any response to this action should be mailed to:
Mail Stop Amendment
Commissioner for Patents
P.O. Box 1450
Alexandria, VA 22313-1450
or faxed to (571) 273-8300.
Hand-delivered responses should be brought to:
United States Patent and Trademark Office
Customer Service Window
Randolph Building
401 Dulany Street
Alexandria, VA 22314
Any inquiry of a general nature or relating to the status of this application or proceeding should be directed to the Technology Center 3600 Customer Service Office whose telephone number is (571) 272-3600.
/Behrang Badii/
Primary Examiner
Art Unit 3665