DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant’s arguments with respect to claims 1 and 9 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless—
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 14-20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Altman (U.S. Patent No. 12,197,209; hereinafter Altman).
Regarding claim 14, Altman teaches a system for marshalling a plurality of vehicles (Altman: Col. 38, lines 10-12; i.e., generate commands and/or signals that are then performed or utilized by a self-driving or autonomous driving unit of the vehicle; Col. 43, lines 46-51; i.e., the vehicular AI unit of said vehicle operates … to transmit said tele-operating command to said other vehicle),
the system comprising: a set of infrastructure sensors associated with a vehicle management system configured to process signals from the set of infrastructure sensors to calculate one or more vehicle commands based on the processed signals from the set of infrastructure sensors (Altman: Col. 6, lines 21-24; i.e., infrastructure elements 160 may comprise one or more sensors 161 (e.g., cameras, microphones, sensors, Light Detection and Ranging (LIDAR) sensors, RADARs or RADAR sensors, or the like); Col. 4, lines 7-22; i.e., a remote tele-operation processor 172, which may be part of remote server 170 or may be in communication with remote server 170, may process the data received from the primary vehicle 110, and may generate vehicular driving or vehicular operation commands… the remote tele-operator … may take into account other data that was not necessarily sensed or received from the primary vehicle 110; for example, … data sensed or received from infrastructure elements);
and one or more sensors on-board each vehicle of the plurality of vehicles that generate signals (Altman: Col. 37, lines 50-52; i.e., vehicular sensors 451 (e.g., cameras, imagers, video cameras, microphones, LIDAR sensors, RADAR sensors…); Col. 43, lines 46-48; i.e., the vehicular AI unit of said vehicle operates (I) to perform AI processing of data sensed by sensors of another vehicle; signals are generated by sensors on the primary vehicle and other vehicles),
wherein each vehicle of the plurality of vehicles is configured to process the one or more vehicle commands calculated by the vehicle management system and the signals generated from the one or more sensors on-board each vehicle of the plurality of vehicles to generate new commands that are sent to each vehicle of the plurality of vehicles to marshal each vehicle of the plurality of vehicles to a location (Altman: Col. 18, lines 57-60; i.e., the waypoints or driving instructions may be sent… by the remote driver or remote AI; Col. 30, line 58 – Col. 31, line 8; i.e., as the autonomous car travels further, newly captured data or freshly acquired data enables the vehicular computer to identify a traffic light … which now turns from green light to red light… these commands may be updated on an ongoing basis, as the vehicle travels further and approaches the traffic light, and as freshly-sensed data is acquired and provides the vehicular computer new information to maintain its previous commands or to modify them; Col. 5, lines 61-64; i.e., the vehicle would perform the tele-driving command unless it is in conflict with a local autonomous driving command which would thus prevail and would be executed; the vehicular computer continuously processes the commands from the external AI and newly captured data from onboard sensors to generate new commands to control the vehicle to move to the next waypoint).
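For illustration only, the following minimal Python sketch shows one hypothetical reading of the data flow mapped above: a vehicle management system derives a command from infrastructure-sensor signals, and each vehicle processes that command together with its own on-board sensor signals to generate a new command. The sketch is not drawn from Altman or from the claims; all names, fields, and values are assumptions.

    # Hypothetical sketch of the claim 14 data flow; not from Altman or the claims.
    from dataclasses import dataclass

    @dataclass
    class VehicleCommand:
        target_x: float      # assigned waypoint coordinates (hypothetical fields)
        target_y: float
        speed_limit: float   # m/s

    def vms_calculate_command(infrastructure_signals: dict) -> VehicleCommand:
        # Vehicle management system side: derive a command from processed
        # infrastructure-sensor signals (here, simply an assigned waypoint).
        x, y = infrastructure_signals["assigned_waypoint"]
        return VehicleCommand(x, y, speed_limit=5.0)

    def vehicle_generate_new_command(vms_cmd: VehicleCommand,
                                     onboard_signals: dict) -> VehicleCommand:
        # On-board side: process the received command together with locally
        # sensed data, letting a local condition modify the command (loosely
        # analogous to Altman's locally prevailing autonomous driving command).
        speed = vms_cmd.speed_limit
        if onboard_signals.get("obstacle_distance_m", float("inf")) < 10.0:
            speed = min(speed, 1.0)
        return VehicleCommand(vms_cmd.target_x, vms_cmd.target_y, speed)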
Regarding claim 15, Altman teaches the system according to claim 14. Altman further teaches wherein the new commands are processed in each vehicle of the plurality of vehicles (Altman: Col. 30, lines 31-34; i.e., a vehicular processor or vehicular computer or other vehicular (in-vehicle) controller, may command one or more parts or components of the vehicle to perform certain operations; the vehicular computer that processes the commands and on-board sensor data and is located in the vehicle).
Regarding claim 16, Altman teaches the system according to claim 14. Altman further teaches wherein each vehicle of the plurality of vehicles executes its own modified commands (Altman: Col. 34, lines 21-24; i.e., data may be processed by the external or remote server/AI unit 270, which in turn may provide tele-operating commands to one or more of the vehicles 210-212; Col. 36, lines 58-60; i.e., tele-operating commands that can be immediately executed by the specific recipient vehicle).
Regarding claim 17, Altman teaches the system according to claim 14. Altman further teaches wherein the new commands account for dynamic obstacles within a zone (Altman: Col. 12, lines 11-13; i.e., make real time decisions regarding other vehicles or pedestrians or other objects).
Regarding claim 18, Altman teaches the system according to claim 14. Altman further teaches wherein the one or more sensors on-board each vehicle of the plurality of vehicles includes at least one of a camera, a lidar, a radar, and an ultrasonic sensor (Altman: Col. 37, lines 50-52; i.e., vehicular sensors 451 (e.g., cameras, imagers, video cameras, microphones, LIDAR sensors, RADAR sensors…)).
Regarding claim 19, Altman teaches the system according to claim 14. Altman further teaches wherein the set of infrastructure sensors includes at least one of a camera, a lidar, and a radar (Altman: Col. 6, lines 21-24; i.e., infrastructure elements 160 may comprise one or more sensors 161 (e.g., cameras, microphones, sensors, Light Detection and Ranging (LIDAR) sensors, RADARs or RADAR sensors, or the like)).
Regarding claim 20, Altman teaches the system according to claim 14. Altman further teaches wherein differences between the vehicle commands calculated by the vehicle management system and the new commands generated by each vehicle of the plurality of vehicles are sent back to the vehicle management system for reconciliation to generate subsequent vehicle commands (Altman: Col. 40, lines 14-18; i.e., detect an inconsistency between the first and second tele-driving commands, wherein said inconsistency comprises at least one of: contradiction, mismatch, duplication, adverse effects, contradictory results; Col. 31, lines 3-4; i.e., these commands may be updated on an ongoing basis; the system repeatedly generates new commands, e.g., when an inconsistency between commands is detected),
and wherein each vehicle of the plurality of vehicles executes the subsequent vehicle commands (Altman: Col. 38, lines 10-12; i.e., generate commands and/or signals that are then performed or utilized by a self-driving or autonomous driving unit of the vehicle; Col. 36, lines 58-60; i.e., tele-operating commands that can be immediately executed by the specific recipient vehicle; each vehicle executes a command).
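Continuing the illustration, the following hypothetical sketch shows one way the reconciliation mapped to claim 20 could operate: the vehicle reports the difference between the command it received and the new command it generated, and the management system folds that difference into a subsequent command. The names and the trivial reconciliation policy are assumptions, not Altman's disclosure.

    # Hypothetical reconciliation step; the policy shown is illustrative only.
    def compute_difference(vms_speed: float, new_speed: float) -> float:
        # Difference the vehicle sends back to the vehicle management system.
        return new_speed - vms_speed

    def vms_reconcile(vms_speed: float, reported_difference: float) -> float:
        # Management-system side: generate the subsequent command from the
        # reported difference, e.g., adopt the vehicle's locally reduced speed.
        return min(vms_speed, vms_speed + reported_difference)

    # Usage: the VMS commanded 5.0 m/s; the vehicle locally reduced it to 1.0 m/s.
    diff = compute_difference(5.0, 1.0)    # -4.0
    subsequent = vms_reconcile(5.0, diff)  # 1.0 m/s subsequent command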
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-13 are rejected under 35 U.S.C. 103 as being unpatentable over Altman in view of Lee et al. (U.S. Publication No. 2023/0055708; hereinafter Lee).
Regarding claim 1, Altman teaches a method for marshalling a vehicle (Altman: Col. 38, lines 10-12; i.e., generate commands and/or signals that are then performed or utilized by a self-driving or autonomous driving unit of the vehicle),
the method comprising: receiving signals from a set of infrastructure sensors associated with a vehicle management system (Altman: Col. 4, lines 21-27; i.e., data sensed or received from infrastructure elements … may be obtained or received via a data fetching unit 173, which may be part of remote server; Col. 6, lines 21-22; i.e., infrastructure elements 160 may comprise one or more sensors);
processing, by the vehicle management system, the signals from the set of infrastructure sensors (Altman: Col. 6, lines 35-36; i.e., the remote server 170 may comprise or may control, or may be associated with, a remote AI module; Col. 36, lines 46-49; i.e., the external AI unit performs AI processing of the vehicular-sensed data (block 330), optionally in combination with other data that is available to the AI unit from other sources; the remote server processes the data from the infrastructure sensors);
calculating, by the vehicle management system, one or more vehicle commands based on the processed signals; sending the calculated one or more vehicle commands to the vehicle (Altman: Col. 36, lines 51-54; i.e., The external AI unit … generates one or more tele-operating commands, and transmits them back to the vehicle);
receiving signals from one or more sensors on-board the vehicle (Altman: Col. 6, lines 38-40; i.e., data is sensed by the multiple sensors, such as vehicular sensors 111 of primary vehicle);
and processing, by the vehicle, the one or more vehicle commands calculated by the vehicle management system and the signals from the one or more sensors on-board the vehicle (Altman: Col. 18, lines 57-60; i.e., the waypoints or driving instructions may be sent… by the remote driver or remote AI; Col. 30, line 58 – Col. 31, line 8; i.e., as the autonomous car travels further, newly captured data or freshly acquired data enables the vehicular computer to identify a traffic light … which now turns from green light to red light… these commands may be updated on an ongoing basis, as the vehicle travels further and approaches the traffic light, and as freshly-sensed data is acquired and provides the vehicular computer new information to maintain its previous commands or to modify them).
Altman does not explicitly teach processing, by the vehicle, the one or more vehicle commands calculated by the vehicle management system and the signals from the one or more sensors on-board the vehicle by fusing the one or more vehicle commands calculated by the vehicle management system with the signals from the one or more sensors on-board the vehicle; and generating, by the vehicle, new commands based on the fused one or more vehicle commands calculated by the vehicle management system and the signals from the one or more sensors on-board the vehicle to marshal the vehicle to a location.
However, in the same field of endeavor, Lee teaches processing, by the vehicle, the one or more vehicle commands calculated by the vehicle management system and the signals from the one or more sensors on-board the vehicle by fusing the one or more vehicle commands calculated by the vehicle management system with the signals from the one or more sensors on-board the vehicle; and generating, by the vehicle, new commands based on the fused one or more vehicle commands calculated by the vehicle management system and the signals from the one or more sensors on-board the vehicle to marshal the vehicle to a location (Lee: Par. 610; i.e., the route provision apparatus 800 may receive map information from the server; Par. 18; i.e., the received map information reflects the route information to the destination; Par. 328; i.e., perform communication with a server and the like by using a satellite navigation system; Par. 614; i.e., when the processor 830 fuses the map information received from the navigation system with the sensing information sensed by the sensors of the vehicle, the processor 830 may perform the fusion based on the route information to the destination; Par. 615; i.e., the processor 830 may … generate a SLAM map only for a predetermined range including route information, along which the vehicle is to travel to a destination, based on the route information; Par. 633; i.e., the processor 830 may use the generated SLAM map to generate an optimal route; the received route information (vehicle command) is fused with vehicle sensor data to generate a new optimal route).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of Altman to have further incorporated processing, by the vehicle, the one or more vehicle commands calculated by the vehicle management system and the signals from the one or more sensors on-board the vehicle by fusing the one or more vehicle commands calculated by the vehicle management system with the signals from the one or more sensors on-board the vehicle; and generating, by the vehicle, new commands based on the fused one or more vehicle commands calculated by the vehicle management system and the signals from the one or more sensors on-board the vehicle to marshal the vehicle to a location, as taught by Lee. Doing so would allow the system to update the route based on the on-board sensor data (Lee: Par. 34; i.e., provide a route provision apparatus that is optimized for generating or updating autonomous driving visibility information).
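For illustration only, the following is a minimal sketch of the kind of fusion the combination relies on: route information received from a server is fused with on-board sensing, and the vehicle generates an updated route. This is an assumption-laden toy, not Lee's actual implementation; Lee's fusion involves map information and SLAM, which are simplified away here.

    # Hypothetical fusion of a server-provided route with on-board sensing.
    def fuse_and_replan(server_route: list[tuple[float, float]],
                        sensed_blocked: set[tuple[float, float]]) -> list[tuple[float, float]]:
        # Keep the server route but drop waypoints that on-board sensors
        # report as blocked, yielding a locally updated ("new") route.
        return [wp for wp in server_route if wp not in sensed_blocked]

    route = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0)]   # from the server
    blocked = {(10.0, 0.0)}                            # on-board obstacle report
    new_route = fuse_and_replan(route, blocked)        # [(0.0, 0.0), (10.0, 10.0)]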
Regarding claim 2, Altman in view of Lee teaches the method according to claim 1. Altman further teaches wherein the one or more vehicle commands and the signals from the one or more sensors on-board the vehicle are processed in the vehicle (Altman: Col. 30, lines 31-34; i.e., a vehicular processor or vehicular computer or other vehicular (in-vehicle) controller, may command one or more parts or components of the vehicle to perform certain operations; the vehicular computer that processes the commands and on-board sensor data and is located in the vehicle).
Regarding claim 3, Altman in view of Lee teaches the method according to claim 1. Altman further teaches the method further comprising sending differences between the vehicle commands calculated by the vehicle management system and the new commands generated by the vehicle back to the vehicle management system for reconciliation to generate subsequent vehicle commands (Altman: Col. 40, lines 14-18; i.e., detect an inconsistency between the first and second tele-driving commands, wherein said inconsistency comprises at least one of: contradiction, mismatch, duplication, adverse effects, contradictory results; Col. 31, lines 3-4; i.e., these commands may be updated on an ongoing basis; the system repeatedly generates new commands, e.g., when an inconsistency between commands is detected).
Regarding claim 4, Altman in view of Lee teaches the method according to claim 3. Altman further teaches wherein the vehicle executes the subsequent vehicle commands (Altman: Col. 38, lines 10-12; i.e., generate commands and/or signals that are then performed or utilized by a self-driving or autonomous driving unit of the vehicle).
Regarding claim 5, Altman in view of Lee teaches the method according to claim 3. Altman further teaches wherein signals communicated with the vehicle management system comprise wireless signals (Altman: Col. 43, lines 52-56; i.e., commands that are sent from a remote tele-operation terminal to said vehicular processor of said vehicles, are transmitted over a multiplicity of wireless communication transceivers that are associated with said vehicular processor).
Regarding claim 6, Altman in view of Lee teaches the method according to claim 1. Altman further teaches wherein the one or more vehicle commands include at least one of rate of change of velocity, velocity, torque, and steering of the vehicle (Altman: Col. 5, lines 12-16; i.e., the received tele-operation commands are “meta-commands” (or, commands that are generic in their nature or that are provided in a format that any vehicle can interpret, such as, “come to a complete stop within two seconds” or “accelerate right now to 60 mph”)).
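For illustration only, a hypothetical structure for a command carrying the kinds of fields recited in claim 6; the field names and units are assumptions, not claim or reference language.

    # Hypothetical command structure; fields mirror the claim 6 categories.
    from dataclasses import dataclass

    @dataclass
    class MarshallingCommand:
        acceleration_mps2: float   # rate of change of velocity
        velocity_mps: float        # commanded velocity
        torque_nm: float           # drive-torque request
        steering_rad: float        # steering-angle request

    # Example: a "come to a complete stop" style meta-command.
    stop_cmd = MarshallingCommand(acceleration_mps2=-2.5, velocity_mps=0.0,
                                  torque_nm=0.0, steering_rad=0.0)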
Regarding claim 7, Altman in view of Lee teaches the method according to claim 1. Altman further teaches wherein the one or more sensors on-board the vehicle includes at least one of a camera, a lidar, a radar, and an ultrasonic sensor (Altman: Col. 37, lines 50-52; i.e., vehicular sensors 451 (e.g., cameras, imagers, video cameras, microphones, LIDAR sensors, RADAR sensors…)).
Regarding claim 8, Altman in view of Lee teaches the method according to claim 1. Altman further teaches wherein the set of infrastructure sensors includes at least one of a camera, a lidar, and a radar (Altman: Col. 6, lines 21-24; i.e., infrastructure elements 160 may comprise one or more sensors 161 (e.g., cameras, microphones, sensors, Light Detection and Ranging (LIDAR) sensors, RADARs or RADAR sensors, or the like)).
Regarding claim 9, Altman teaches a method for marshalling a vehicle (Altman: Col. 38, lines 10-12; i.e., generate commands and/or signals that are then performed or utilized by a self-driving or autonomous driving unit of the vehicle),
the method comprising: processing signals from one or more infrastructure sensors associated with a vehicle management control system (Altman: Col. 4, lines 7-22; i.e., a remote tele-operation processor 172, which may be part of remote server 170 or may be in communication with remote server 170, may process the data received from the primary vehicle 110, and may generate vehicular driving or vehicular operation commands… the remote tele-operator … may take into account other data that was not necessarily sensed or received from the primary vehicle 110; for example, … data sensed or received from infrastructure elements);
calculating, by the vehicle management control system, one or more vehicle commands based on the processed signals from the one or more infrastructure sensors; sending the one or more calculated vehicle commands to the vehicle (Altman: Col. 36, lines 51-54; i.e., The external AI unit … generates one or more tele-operating commands, and transmits them back to the vehicle);
generating signals from one or more sensors on-board the vehicle (Altman: Col. 6, lines 38-40; i.e., data is sensed by the multiple sensors, such as vehicular sensors 111 of primary vehicle);
processing, by the vehicle, the one or more vehicle commands calculated by the vehicle management control system and the signals generated from the one or more sensors on-board the vehicle; generating, by the vehicle, new commands for the vehicle (Altman: Col. 18, lines 57-60; i.e., the waypoints or driving instructions may be sent… by the remote driver or remote AI; Col. 30, line 58 – Col. 31, line 8; i.e., as the autonomous car travels further, newly captured data or freshly acquired data enables the vehicular computer to identify a traffic light … which now turns from green light to red light… these commands may be updated on an ongoing basis, as the vehicle travels further and approaches the traffic light, and as freshly-sensed data is acquired and provides the vehicular computer new information to maintain its previous commands or to modify them; Col. 5, lines 61-64; i.e., the vehicle would perform the tele-driving command unless it is in conflict with a local autonomous driving command which would thus prevail and would be executed; the vehicular computer continuously processes the commands from the external AI and newly captured data from onboard sensors to generate new commands to control the vehicle to move to the next waypoint),
sending, by the vehicle, differences between the one or more vehicle commands calculated by the vehicle management system and the new commands generated by the vehicle by wireless communications to the vehicle management control system for reconciliation; generating, by the vehicle management control system, subsequent vehicle commands based on the differences (Altman: Col. 40, lines 14-18; i.e., detect an inconsistency between the first and second tele-driving commands, wherein said inconsistency comprises at least one of: contradiction, mismatch, duplication, adverse effects, contradictory results; Col. 31, lines 3-4; i.e., these commands may be updated on an ongoing basis; the system repeatedly generates new commands, e.g., when an inconsistency between commands is detected),
and executing the subsequent vehicle commands by the vehicle to marshal the vehicle to a location (Altman: Col. 38, lines 10-12; i.e., generate commands and/or signals that are then performed or utilized by a self-driving or autonomous driving unit of the vehicle).
Altman does not explicitly teach generating, by the vehicle, new commands for the vehicle by fusing the one or more vehicle commands calculated by the vehicle management system with the signals from the one or more sensors on-board the vehicle.
However, in the same field of endeavor, Lee teaches generating, by the vehicle, new commands for the vehicle by fusing the one or more vehicle commands calculated by the vehicle management system with the signals from the one or more sensors on-board the vehicle (Lee: Par. 610; i.e., the route provision apparatus 800 may receive map information from the server; Par. 18; i.e., the received map information reflects the route information to the destination; Par. 328; i.e., perform communication with a server and the like by using a satellite navigation system; Par. 614; i.e., when the processor 830 fuses the map information received from the navigation system with the sensing information sensed by the sensors of the vehicle, the processor 830 may perform the fusion based on the route information to the destination; Par. 615; i.e., the processor 830 may … generate a SLAM map only for a predetermined range including route information, along which the vehicle is to travel to a destination, based on the route information; Par. 633; i.e., the processor 830 may use the generated SLAM map to generate an optimal route; the received route information (vehicle command) is fused with vehicle sensor data to generate a new optimal route).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of Altman to have further incorporated generating, by the vehicle, new commands for the vehicle by fusing the one or more vehicle commands calculated by the vehicle management system with the signals from the one or more sensors on-board the vehicle, as taught by Lee. Doing so would allow the system to update the route based on the on-board sensor data (Lee: Par. 34; i.e., provide a route provision apparatus that is optimized for generating or updating autonomous driving visibility information).
Regarding claim 10, Altman in view of Lee teaches the method according to claim 9. Altman further teaches wherein the one or more vehicle commands and the signals from the one or more sensors on-board the vehicle are processed in the vehicle (Altman: Col. 30, lines 31-34; i.e., a vehicular processor or vehicular computer or other vehicular (in-vehicle) controller, may command one or more parts or components of the vehicle to perform certain operations; the vehicular computer that processes the commands and on-board sensor data and is located in the vehicle).
Regarding claim 11, Altman in view of Lee teaches the method according to claim 9. Altman further teaches wherein the one or more vehicle commands include at least one of rate of change of velocity, velocity, torque, and steering of the vehicle (Altman: Col. 5, lines 12-16; i.e., the received tele-operation commands are “meta-commands” (or, commands that are generic in their nature or that are provided in a format that any vehicle can interpret, such as, “come to a complete stop within two seconds” or “accelerate right now to 60 mph”)).
Regarding claim 12, Altman in view of Lee teaches the method according to claim 9. Altman further teaches wherein the one or more sensors on-board the vehicle includes at least one of a camera, a lidar, a radar, and an ultrasonic sensor (Altman: Col. 37, lines 50-52; i.e., vehicular sensors 451 (e.g., cameras, imagers, video cameras, microphones, LIDAR sensors, RADAR sensors…)).
Regarding claim 13, Altman in view of Lee teaches the method according to claim 9. Altman further teaches wherein the one or more infrastructure sensors include at least one of a camera, a lidar, and a radar (Altman: Col. 6, lines 21-24; i.e., infrastructure elements 160 may comprise one or more sensors 161 (e.g., cameras, microphones, sensors, Light Detection and Ranging (LIDAR) sensors, RADARs or RADAR sensors, or the like)).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BRANDON Z WILLIS whose telephone number is (571) 272-5427. The examiner can normally be reached on weekdays, 8:00-5:30.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Erin D. Bishop can be reached at (571) 270-3713. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/BRANDON Z WILLIS/Examiner, Art Unit 3665