DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Status of Claims
Claims 1-20 are pending and have been examined below.
Claim Objections
Claim 11 is objected to because of the following informality, for which correction is required:
The limitation "the at least one vehicle" lacks antecedent basis; it is presumed to refer to the previously recited "An autonomous vehicle."
Claim Rejections - 35 USC § 112
The following is a quotation of 35 USC 112(b):
(B) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 USC 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 11-18 are rejected under 35 USC 112(b) or 35 USC 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for pre-AIA, the applicant) regards as the invention.
Claim 11
The phrase "processor (e.g., cloud-based processing system) in electronic" renders the claim indefinite. The exemplary language leaves it unclear whether the processor is limited to a cloud-based processing system, so the metes and bounds of the claim are unclear to one of ordinary skill in the art. Correction is required.
Claims depending from the above claim do not remedy this deficiency and are rejected for the same reasons.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 USC 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-4, 7-14, 17 and 18 are rejected under 35 USC 102(a)(1) as being anticipated by WO2018035145 ("Kutila").
Claim 1
Kutila discloses an autonomous vehicle system (0004 Methods and systems disclosed herein are used to support autonomous vehicle platoons for a platoon structure reconfiguration.) comprising:
a plurality of autonomous vehicles (AVs) in electronic communication with one another (0044 A communications unit 428 is used to receive and transmit data with platooning vehicles), each AV comprising:
at least one sensing device (0045 The lead vehicle may have the best visual view for events in front of the platoon. Video from the lead vehicle may be displayed on a see-through video display module within each vehicle in the platoon. For some embodiments, when a new leader (or supervisor) is determined to lead a platoon, the see-through video display may switch to another vehicle., 0070 For some embodiments, information transmitted to the lead vehicle by other vehicles may include sensor data or video data recorded by another vehicle. For other embodiments, the lead vehicle may receive environment perception system data or vehicle computer system data from other vehicles in the platoon. The lead vehicle may determine the most accurate measurement data (including sensor and video data) or the vehicle using the best object detection algorithms and use data under those scenarios., 0057 A SendVideoData message 622 is sent from the lead vehicle 604 to another vehicle 606 in the platoon. The message 622 contains video data captured by the lead vehicle 604 in assessing an undetected (or unknown) object. The message 622 has one field, VideoDataStream. The VideoDataStream field is a video data stream that is recorded by a camera attached to the lead vehicle., 0064 sensor data (such as location of the detected object) and video data from the front vehicle are transmitted to a new lead vehicle (based on a new platoon configuration (or mini-platoons) 914) to enable a new driver to see 916 the incident message and to assess why platoon control is being switched or reconfigured., 0065 The platoon configuration information is transmitted to the vehicles in the platoon via messages with a mini-platoon or convoy identifier, a lead vehicle identifier, and a position number in the mini-platoon (or convoy) for the follower vehicle. For one embodiment, the platoon configuration information may be transmitted to an ITS service station.
The platoon vehicles perform actions, such as changing lanes to follow a new lead vehicle for a mini-platoon, in accordance with the selected platoon configuration);
at least one processor (0091 processor); and
a memory having instructions stored thereon, wherein the instructions when executed by the at least one processor (0095 the processor 118 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 130 and/or the removable memory 132.), cause the at least one processor to:
obtain vehicle data and/or environmental data via the at least one sensing device (0057 A SendVideoData message 622 is sent from the lead vehicle 604 to another vehicle 606 in the platoon. The message 622 contains video data captured by the lead vehicle 604 in assessing an undetected (or unknown) object. The message 622 has one field, VideoDataStream. The VideoDataStream field is a video data stream that is recorded by a camera attached to the lead vehicle.);
determine one or more following or coupling parameters for forming a variable AV train (0051 FIG. 6 is a message sequence diagram 600 for a scenario where a platoon is split into two or more mini-platoons. A lead vehicle 604 sends a RequestlncidentMap message 608 to a trigger system 602. A trigger system (or trigger) 602 for reconfiguring a platoon or selecting a new lead (or supervisor) vehicle may determine whether a particular driving scenario may benefit from a reconfiguration of a platoon.);
electronically and/or mechanically connect and disconnect to at least another AV to form the variable AV train (0038 In this scenario, passenger cars 306, 310 form a new mini-platoon of cars 316, 320 on the right side of FIG. 3. The new mini-platoon of cars 316, 320 may make changes, such as slowing down and making maneuvers to pass the obstacle 322, that are not performed by the platoon of big trucks 312, 314, 318. The platoon leader 302 may continue as platoon leader 1 (312) of the big trucks 312, 314, 318 while platoon leader 2 (316) leads the mini-platoon of cars 316, 320., 0039, 0062 If a plurality of vehicles forms a platoon and switches to platoon mode 802, several initial conditions exist. Each vehicle in the platoon is in platooning mode. Vehicle 1 is the front (or lead) vehicle and master (or controller) of the platoon. At least vehicle 1 and one other vehicle have a driver behind the steering wheel. A person behind a steering wheel may not actually be driving a vehicle. A "see-through" application is running continuously in the front vehicle to transmit video data to the other vehicles in the platoon and to enable other drivers a front view of the platoon, 0063 When all of the vehicles in a platoon are in platoon mode, a system (or lead vehicle for some embodiments) sends driver availability requests to each vehicle in the platoon); and
continuously broadcast the obtained vehicle and/or environmental data to the at least another AV subsequent to forming the variable AV train (0033, 0057 see-through video of the current front vehicle is transmitted to the HMI display in the new leader vehicle. A SendVideoData message 622 is sent from the lead vehicle 604 to another vehicle 606 in the platoon. The message 622 contains video data captured by the lead vehicle 604 in assessing an undetected (or unknown) object. The message 622 has one field, VideoDataStream. The VideoDataStream field is a video data stream that is recorded by a camera attached to the lead vehicle., 0062 At least vehicle 1 and one other vehicle have a driver behind the steering wheel. A person behind a steering wheel may not actually be driving a vehicle. A "see-through" application is running continuously in the front vehicle to transmit video data to the other vehicles in the platoon and to enable other drivers a front view of the platoon.).
Claim 2
Kutila discloses:
wherein each of the plurality of AVs is configured to obtain the vehicle data and/or environmental data or form the variable AV train in response to a request or command received from a computing device or at least one mobile device (0029 For some embodiments, a system may receive via a V2X channel from other vehicles or from an ITS service station a message requesting a platoon configuration change. This message may be sent for safety reasons or for maintaining autonomous platooning mode. A platoon may be split into two platoons for multiple reasons, such as overtaking the lead vehicle (or front vehicle) in two sections or passing through road maintenance work where passenger cars use one lane and trucks use another lane., 0065, 0068).
Claim 3
Kutila discloses:
wherein the instructions when executed by each processor cause each processor to further: self-navigate using the one or more following parameters and/or the broadcast and/or received data (0005 continue an autonomous driving mode without disbanding a platoon, 0043 412, 420, a human-machine interface (HMI) 414, 422, a driver state analysis module 416, 424, and a platooning function 418, 426. A communications unit 412, 420 may use a data exchange channel (such as DSRC and 5G) 410 to transmit and receive data to and from platooning vehicles 402, 404, other vehicles 406, and ITS service stations 408.).
Claim 4
Kutila discloses:
wherein operations of each AV are optimized using one or more algorithms or machine learning models (0070 The lead vehicle may determine the most accurate measurement data (including sensor and video data) or the vehicle using the best object detection algorithms and use data under those scenarios. Note that Kutila discloses that any vehicle may become the lead vehicle, and thus any vehicle may use the recited algorithm).
Claim 7
Kutila discloses:
wherein each processor is configured to continuously broadcast the obtained vehicle and/or environmental data using a Bluetooth BLE advertising mode, short-range wireless communication protocol, or Near-Field Communication (NFC) protocol (0098 Bluetooth, 0082 the communications system 100 may be a multiple access system and may employ one or more channel-access schemes, such as CDMA, TDMA, FDMA, OFDMA, SC-FDMA, and the like. For example, the base station 114a in the RAN 103/104/105 and the WTRUs 102a, 102b, 102c may implement a radio technology such as Universal Mobile Telecommunications System (UMTS) Terrestrial Radio Access (UTRA), which may establish the air interface 115/116/117 using wideband CDMA (WCDMA). WCDMA may include communication protocols such as High-Speed Packet Access (HSPA) and/or Evolved HSPA (HSPA+). HSPA may include High-Speed Downlink Packet Access (HSDPA) and/or High-Speed Uplink Packet Access (HSUPA)., 0083 In another embodiment, the base station 114a and the WTRUs 102a, 102b, 102c may implement a radio technology such as Evolved UMTS Terrestrial Radio Access (E-UTRA), which may establish the air interface 115/116/117 using Long Term Evolution (LTE) and/or LTE-Advanced (LTE-A).).
Claim 8
Kutila discloses:
wherein the vehicle data and/or environmental data includes at least one of speed, location, acceleration, relative distance, or relative speed (0064 sensor data (such as location of the detected object) and video data from the front vehicle are transmitted to a new lead vehicle (based on a new platoon configuration (or mini-platoons) 914) to enable a new driver to see 916 the incident message and to assess why platoon control is being switched or reconfigured., 0065 The platoon configuration information is transmitted to the vehicles in the platoon via messages with a mini-platoon or convoy identifier, a lead vehicle identifier, and a position number in the mini-platoon (or convoy) for the follower vehicle. For one embodiment, the platoon configuration information may be transmitted to an ITS service station. The platoon vehicles perform actions, such as changing lanes to follow a new lead vehicle for a mini-platoon, in accordance with the selected platoon configuration).
Claim 9
Kutila discloses:
wherein each AV is configured to self-navigate using a reinforcement learning model or optimization operation (0042 An exemplary system operates to maximize use of autonomous mode for platooning vehicles while minimizing interventions with drivers and vehicle occupants., 0037 For some embodiments, when an in-vehicle computer detects an incident or lane obstacle that may result in a platoon split, the computer will calculate an optimal number of mini- platoons. Other vehicles may be allowed to be assigned the lead driver (or supervisor) role after checking which vehicles are available. Also, a system will minimize manual driving and generate an optimal number of mini-platoons.).
Claim 10
Kutila discloses:
wherein each AV comprises a drive-by-wire component (0125 Note that various hardware elements of one or more of the described embodiments are referred to as "modules" that carry out (i.e., perform, execute, and the like) various functions that are described herein in connection with the respective modules. As used herein, a module includes hardware (e.g., one or more processors, one or more microprocessors, one or more microcontrollers, one or more microchips, one or more application-specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), one or more memory devices) deemed suitable by those of skill in the relevant art for a given implementation. Each described module may also include instructions executable for carrying out the one or more functions described as being carried out by the respective module).
Claim 11
Kutila discloses an autonomous vehicle (AV) (0042, 0029) comprising:
at least one sensing device (0064 sensor data (such as location of the detected object) and video data from the front vehicle are transmitted to a new lead vehicle (based on a new platoon configuration (or mini-platoons) 914) to enable a new driver to see 916 the incident message and to assess why platoon control is being switched or reconfigured., 0065 The platoon configuration information is transmitted to the vehicles in the platoon via messages with a mini-platoon or convoy identifier, a lead vehicle identifier, and a position number in the mini-platoon (or convoy) for the follower vehicle. For one embodiment, the platoon configuration information may be transmitted to an ITS service station. The platoon vehicles perform actions, such as changing lanes to follow a new lead vehicle for a mini-platoon, in accordance with the selected platoon configuration, 0045 The lead vehicle may have the best visual view for events in front of the platoon. Video from the lead vehicle may be displayed on a see-through video display module within each vehicle in the platoon. For some embodiments, when a new leader (or supervisor) is determined to lead a platoon, the see-through video display may switch to another vehicle., 0070 For some embodiments, information transmitted to the lead vehicle by other vehicles may include sensor data or video data recorded by another vehicle. For other embodiments, the lead vehicle may receive environment perception system data or vehicle computer system data from other vehicles in the platoon. The lead vehicle may determine the most accurate measurement data (including sensor and video data) or the vehicle using the best object detection algorithms and use data under those scenarios., 0057 A SendVideoData message 622 is sent from the lead vehicle 604 to another vehicle 606 in the platoon. The message 622 contains video data captured by the lead vehicle 604 in assessing an undetected (or unknown) object.
The message 622 has one field, VideoDataStream. The VideoDataStream field is a video data stream that is recorded by a camera attached to the lead vehicle.);
at least one processor (e.g., cloud-based processing system) in electronic communication with the at least one vehicle (0091 processor); and
a memory having instructions thereon, wherein the instructions when executed by the processor (0095 the processor 118 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 130 and/or the removable memory 132.), cause the processor to:
obtain vehicle data and/or environmental data via the at least one sensing device (0057 A SendVideoData message 622 is sent from the lead vehicle 604 to another vehicle 606 in the platoon. The message 622 contains video data captured by the lead vehicle 604 in assessing an undetected (or unknown) object. The message 622 has one field, VideoDataStream. The VideoDataStream field is a video data stream that is recorded by a camera attached to the lead vehicle.);
determine one or more following or coupling parameters for forming a variable AV train (0051 FIG. 6 is a message sequence diagram 600 for a scenario where a platoon is split into two or more mini-platoons. A lead vehicle 604 sends a RequestlncidentMap message 608 to a trigger system 602. A trigger system (or trigger) 602 for reconfiguring a platoon or selecting a new lead (or supervisor) vehicle may determine whether a particular driving scenario may benefit from a reconfiguration of a platoon.);
electronically and/or mechanically connect and disconnect to at least another AV to form the variable AV train (0038 In this scenario, passenger cars 306, 310 form a new mini-platoon of cars 316, 320 on the right side of FIG. 3. The new mini-platoon of cars 316, 320 may make changes, such as slowing down and making maneuvers to pass the obstacle 322, that are not performed by the platoon of big trucks 312, 314, 318. The platoon leader 302 may continue as platoon leader 1 (312) of the big trucks 312, 314, 318 while platoon leader 2 (316) leads the mini-platoon of cars 316, 320., 0039, 0062 If a plurality of vehicles forms a platoon and switches to platoon mode 802, several initial conditions exist. Each vehicle in the platoon is in platooning mode. Vehicle 1 is the front (or lead) vehicle and master (or controller) of the platoon. At least vehicle 1 and one other vehicle have a driver behind the steering wheel. A person behind a steering wheel may not actually be driving a vehicle. A "see-through" application is running continuously in the front vehicle to transmit video data to the other vehicles in the platoon and to enable other drivers a front view of the platoon, 0063 When all of the vehicles in a platoon are in platoon mode, a system (or lead vehicle for some embodiments) sends driver availability requests to each vehicle in the platoon); and
continuously broadcast the obtained vehicle and/or environmental data to the at least another AV subsequent to forming the variable AV train (0033, 0057 see-through video of the current front vehicle is transmitted to the HMI display in the new leader vehicle. A SendVideoData message 622 is sent from the lead vehicle 604 to another vehicle 606 in the platoon. The message 622 contains video data captured by the lead vehicle 604 in assessing an undetected (or unknown) object. The message 622 has one field, VideoDataStream. The VideoDataStream field is a video data stream that is recorded by a camera attached to the lead vehicle., 0062 At least vehicle 1 and one other vehicle have a driver behind the steering wheel. A person behind a steering wheel may not actually be driving a vehicle. A "see-through" application is running continuously in the front vehicle to transmit video data to the other vehicles in the platoon and to enable other drivers a front view of the platoon.).
Claims 12, 13, 14, 17 and 18
Claims 12, 13, 14, 17 and 18 recite subject matter similar to that of claims 2, 3, 4, 7 and 8, respectively, and are rejected on the same grounds.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 USC 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 5, 6, 15 and 16 are rejected under 35 USC 103 as being unpatentable over Kutila in view of US20220388530 (“Patne”).
Claim 5
Kutila fails to disclose wherein each processor is further configured to dynamically switch between obtaining data from the at least one sensing device to at least one mobile device within the AV based on detected environmental conditions. However, Kutila does disclose data sources that include both the at least one sensing device and at least one mobile device within the AV (0078 By way of example, the WTRUs 102a, 102b, 102c, 102d may be configured to transmit and/or receive wireless signals and may include user equipment (UE), a mobile station, a fixed or mobile subscriber unit, a pager, a cellular telephone, a personal digital assistant (PDA), a smartphone, a laptop, a netbook, a personal computer, a wireless sensor, consumer electronics, and the like, 0070 For some embodiments, information transmitted to the lead vehicle by other vehicles may include sensor data or video data recorded by another vehicle. For other embodiments, the lead vehicle may receive environment perception system data or vehicle computer system data from other vehicles in the platoon. The lead vehicle may determine the most accurate measurement data (including sensor and video data) or the vehicle using the best object detection algorithms and use data under those scenarios.). Furthermore, Patne teaches a system of collecting data in an autonomous vehicle (claim 1), including:
wherein each processor is further configured to dynamically switch between obtaining data from the at least one sensing device to at least one mobile device within the AV based on detected environmental conditions (0068 In one embodiment, the transport 120 may receive sensor data from one or more occupant devices within the transport 120, where the sensor data from the occupant devices indicates proper function. Occupant devices may include any type of computing device, including but not limited to smart phones, tablets, notebook computers, smart watches, wearable computers and the like. A camera in the occupant device may transmit camera video over a Bluetooth or wi-fi connection to the display or entertainment processor of the transport 120, which may then relay the camera video to the main processor of the transport 120. For example, in the case of a malfunctioning blind spot sensor 130, a transport 120 occupant may use the camera in a smart phone to capture another transport in the transport's 120 blind spot. The smart phone may transmit the camera images to the transport 120, which may replace information from the malfunctioning sensor 130 with the sensor data (e.g., camera images) from the one or more occupant devices. Examiner notes that such technique of switching data sources could be applied to any of the vehicles in the train of Kutila).
Kutila and Patne both disclose systems of data collection from various sources for autonomous vehicles. Thus, it would have been obvious to one having ordinary skill in the art before the effective filing date of Applicant's invention to modify the system in Kutila to include the teaching of Patne, with a reasonable expectation of success, in order to improve vehicle and passenger safety by providing accurate and current data to the autonomous vehicle during a sensor malfunction.
Claim 6
Kutila fails to disclose wherein the environmental conditions include at least one of poor visibility, sensor malfunction, and low battery. However, Kutila does disclose capturing environmental conditions (0031). Furthermore, Patne teaches:
wherein the environmental conditions include at least one of poor visibility, sensor malfunction, and low battery (0068 In one embodiment, the transport 120 may receive sensor data from one or more occupant devices within the transport 120, where the sensor data from the occupant devices indicates proper function. Occupant devices may include any type of computing device, including but not limited to smart phones, tablets, notebook computers, smart watches, wearable computers and the like. A camera in the occupant device may transmit camera video over a Bluetooth or wi-fi connection to the display or entertainment processor of the transport 120, which may then relay the camera video to the main processor of the transport 120. For example, in the case of a malfunctioning blind spot sensor 130, a transport 120 occupant may use the camera in a smart phone to capture another transport in the transport's 120 blind spot. The smart phone may transmit the camera images to the transport 120, which may replace information from the malfunctioning sensor 130 with the sensor data (e.g., camera images) from the one or more occupant devices.).
See prior art rejection of claim 5 for obviousness and reasons to combine.
Claims 15 and 16
Claims 15 and 16 recite subject matter similar to that of claims 5 and 6, respectively, and are rejected on the same grounds.
Claim 19 is rejected under 35 USC 103 as being unpatentable over US20210150429 (“Atanashu”) in view of Kutila.
Claim 19
Atanashu discloses a computer-implemented method (abstract) comprising:
receiving, by at least one processor, at least one ride hailing request (abstract: receiving a first trip request including a first vehicle identification, a first trip origin, and a first trip destination);
determining, by the at least one processor, one or more optimal docking location(s) based at least in part on the at least one ride hailing request (0005 Methods may include providing for route guidance of the first vehicle along the first route and the second vehicle along the second route. The platooning plan may include a joining location where the first route begins to overlap the second route.);
determining, by the at least one processor, at least one AV route to the one or more optimal docking location(s) (0057 The vehicles may each be provided with a route plan which may be in the form of route guidance, where route guidance not only provides navigational information to follow a route, but also provides information pertaining to platooning and vehicle control. For example, route guidance may include joining locations where the vehicle may join or form a platoon. Route guidance may also include platooning parameters, such as following distance or time gap, speed restrictions, or other information to facilitate travel along the route.); and
triggering, by the at least one processor, docking operations, undocking operations, and/or navigation of at least one variable AV train based at least in part on the at least one AV route (0057 The vehicles may each be provided with a route plan which may be in the form of route guidance, where route guidance not only provides navigational information to follow a route, but also provides information pertaining to platooning and vehicle control. For example, route guidance may include joining locations where the vehicle may join or form a platoon. Route guidance may also include platooning parameters, such as following distance or time gap, speed restrictions, or other information to facilitate travel along the route.),
wherein the at least one variable AV train comprises a plurality of AVs that are each configured to:
determine one or more following or coupling parameters for forming the at least one variable AV train (0049 Embodiments described herein may implement a central, cloud-based platoon matching exchange, which may be hosted by a server (e.g. server 12 or platoon matching exchange server 32), which could continually ingest origins, destinations, and acceptable parameters of a trip for a particular vehicle in order to appropriately match each vehicle with a platoon that would most appropriately and efficiently align with the user's planned trip. Parameters for a trip may include, for example, waypoints, types of roadways (e.g., avoiding interstate highways or preferring interstate highways), other vehicles making the trip associated with the user (e.g., if a company has a fleet of vehicles traveling on a given road network sharing routes of the trips partially or entirely), timing of the trip (e.g., departure or desired arrival) etc.), and
electronically and/or mechanically connect and disconnect to at least another AV to form the at least one variable AV train (0049 Embodiments described herein may implement a central, cloud-based platoon matching exchange, which may be hosted by a server (e.g. server 12 or platoon matching exchange server 32), which could continually ingest origins, destinations, and acceptable parameters of a trip for a particular vehicle in order to appropriately match each vehicle with a platoon that would most appropriately and efficiently align with the user's planned trip. Parameters for a trip may include, for example, waypoints, types of roadways (e.g., avoiding interstate highways or preferring interstate highways), other vehicles making the trip associated with the user (e.g., if a company has a fleet of vehicles traveling on a given road network sharing routes of the trips partially or entirely), timing of the trip (e.g., departure or desired arrival) etc., 0057 route guidance may include joining locations where the vehicle may join or form a platoon. Route guidance may also include platooning parameters, such as following distance or time gap, speed restrictions, or other information to facilitate travel along the route.).
Atanashu fails to explicitly disclose wherein the at least one variable AV train comprises a plurality of AVs that are each configured to continuously broadcast obtained vehicle and/or environmental data to at least another AV subsequent to forming the at least one variable AV train. However, Atanashu does disclose forming the at least one variable AV train (0057). Furthermore, Kutila discloses a system of forming an AV train (0038 In this scenario, passenger cars 306, 310 form a new mini-platoon of cars 316, 320 on the right side of FIG. 3. The new mini-platoon of cars 316, 320 may make changes, such as slowing down and making maneuvers to pass the obstacle 322, that are not performed by the platoon of big trucks 312, 314, 318. The platoon leader 302 may continue as platoon leader 1 (312) of the big trucks 312, 314, 318 while platoon leader 2 (316) leads the mini-platoon of cars 316, 320., 0039), including:
wherein the at least one variable AV train comprises a plurality of AVs that are each configured to continuously broadcast obtained vehicle and/or environmental data to at least another AV subsequent to forming the at least one variable AV train (0033, 0057 see-through video of the current front vehicle is transmitted to the HMI display in the new leader vehicle. A SendVideoData message 622 is sent from the lead vehicle 604 to another vehicle 606 in the platoon. The message 622 contains video data captured by the lead vehicle 604 in assessing an undetected (or unknown) object. The message 622 has one field, VideoDataStream. The VideoDataStream field is a video data stream that is recorded by a camera attached to the lead vehicle., 0062 At least vehicle 1 and one other vehicle have a driver behind the steering wheel. A person behind a steering wheel may not actually be driving a vehicle. A "see-through" application is running continuously in the front vehicle to transmit video data to the other vehicles in the platoon and to enable other drivers a front view of the platoon.).
Atanashu and Kutila both disclose systems of forming an AV train. Thus, it would have been obvious to one having ordinary skill in the art before the effective filing date of Applicant's invention to modify the system in Atanashu to include the teaching of Kutila with a reasonable expectation of success in order to increase safety of the following vehicles by providing them with information of obstacles and events ahead of the lead vehicle.
Claim 20 is rejected under 35 USC 103 as being unpatentable over Atanashu in view of Kutila, in further view of Patne.
Claim 20
Atanashu fails to disclose wherein each AV is configured to dynamically switch between obtaining data from at least one sensing device and at least one mobile device within the AV based on detected environmental conditions. However, Atanashu does disclose sources of data to include both the at least one sensing device and at least one mobile device within the AV (0085 Trailing vehicles may use autonomous or semi-autonomous control to maintain following distances and to follow the lead of the vehicle in front of them. This may be performed by one or both of sensors of the trailing vehicle providing information regarding a vehicle they are following, 0091 As shown, map data 102 may be stored in a database and include road information such as the types of road and relevant information such as curvature, topography, maximum heights and widths, etc. A data gatherer 104 may extract data from various sources or scrape data from network elements for regulatory data, traffic, weather, etc). Furthermore, Patne teaches a system of collecting data in an autonomous vehicle (claim 1), including:
wherein each AV is configured to dynamically switch between obtaining data from at least one sensing device and at least one mobile device within the AV based on detected environmental conditions (0068 In one embodiment, the transport 120 may receive sensor data from one or more occupant devices within the transport 120, where the sensor data from the occupant devices indicates proper function. Occupant devices may include any type of computing device, including but not limited to smart phones, tablets, notebook computers, smart watches, wearable computers and the like. A camera in the occupant device may transmit camera video over a Bluetooth or wi-fi connection to the display or entertainment processor of the transport 120, which may then relay the camera video to the main processor of the transport 120. For example, in the case of a malfunctioning blind spot sensor 130, a transport 120 occupant may use the camera in a smart phone to capture another transport in the transport's 120 blind spot. The smart phone may transmit the camera images to the transport 120, which may replace information from the malfunctioning sensor 130 with the sensor data (e.g., camera images) from the one or more occupant devices. Examiner notes that such technique of switching data sources could be applied to any of the vehicles in the train of Atanashu).
Atanashu and Patne both disclose systems of data collection from various sources for autonomous vehicles. Thus, it would have been obvious to one having ordinary skill in the art before the effective filing date of Applicant's invention to modify the system in Atanashu to include the teaching of Patne with a reasonable expectation of success in order to improve vehicle and passenger safety by providing accurate and current data to the autonomous vehicle during sensor malfunction.
Contact Information
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Examiner KRISHNAN RAMESH whose telephone number is (571)272-6407. The examiner can normally be reached Monday-Friday 8:30am-5:00pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Abby Flynn, can be reached at (571)272-9855. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/KRISHNAN RAMESH/
Primary Examiner, Art Unit 3663