DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 01/27/2026 has been entered.
Priority
Applicant’s claim for the benefit of a prior-filed application under 35 U.S.C. 119(e) or under 35 U.S.C. 120, 121, 365(c), or 386(c) is acknowledged. Applicant has not complied with one or more conditions for receiving the benefit of an earlier filing date under 35 U.S.C. 119(e) as follows:
The later-filed application must be an application for a patent for an invention which is also disclosed in the prior application (the parent or original nonprovisional application or provisional application). The disclosure of the invention in the parent application and in the later-filed application must be sufficient to comply with the requirements of 35 U.S.C. 112(a) or the first paragraph of pre-AIA 35 U.S.C. 112, except for the best mode requirement. See Transco Products, Inc. v. Performance Contracting, Inc., 38 F.3d 551, 32 USPQ2d 1077 (Fed. Cir. 1994).
The disclosures of the prior-filed applications, Application Nos. 62/697912, 62/697915, 62/697919, 62/697922, 62/697930, 62/697938, 62/697940, 62/697946, 62/697952, 62/697957, 62/697960, 62/697962, 62/697965, 62/697969, and 62/697971, fail to provide adequate support or enablement in the manner provided by 35 U.S.C. 112(a) or pre-AIA 35 U.S.C. 112, first paragraph, for one or more claims of this application. The subject matter of claims 1-20 is not described in any of the prior-filed applications listed above in a manner that provides adequate support under 35 U.S.C. 112(a). Accordingly, claims 1-20 are not entitled to the benefit of the prior applications listed above.
For purposes of examination, the claims are accorded the following benefit:
Claims 1-5, 7, 9-10, and 16-20 are given the benefit of the prior-filed application No. PCT/US19/041720, with a filing date of 07/12/2019.
Response to Amendment
This action is in response to amendments and remarks filed on 07/16/2025. Claims 1-5, 7, 9-10, and 16-20 are considered in this Office action. Claims 1-2, 9, and 16-17 have been amended. Claims 6, 8, and 11-15 have been cancelled. Claims 1-5, 7, 9-10, and 16-20 are pending examination. The 35 U.S.C. 112(a) rejections of claims 1-5, 7, 9-10, and 16-20 are withdrawn in light of the instant amendments.
Response to Arguments
Applicant presents the following arguments regarding the previous office action:
“Applicant disagrees with the Examiner’s characterization of the references…Applicant respectfully submits that pending claims are in form for allowance and are not taught or suggested by the cited references.”
Applicant's argument has been fully considered but it is not persuasive.
Applicant's arguments fail to comply with 37 CFR 1.111(b) because they amount to a general allegation that the claims define a patentable invention without specifically pointing out how the language of the claims patentably distinguishes them from the references.
Applicant's arguments do not comply with 37 CFR 1.111(c) because they do not clearly point out the patentable novelty which he or she thinks the claims present in view of the state of the art disclosed by the references cited or the objections made. Further, they do not show how the amendments avoid such references or objections.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-5, 7, 10, and 16-20 are rejected under 35 U.S.C. 103 as being unpatentable over Silver et al. (US 9,558,659 B1) in view of Neubecker et al. (US 2017/0243481 A1).
Regarding claim 1, Silver teaches “A method comprising: receiving first image data from a first sensor of a first vehicle (Col. 1 lines 18-20 teaches the vehicle perceives and interprets its surroundings using cameras); identifying a first object based on the first image data (Col. 1 lines 45-47 teaches identifying one or more traffic control factors (objects) relating to a first vehicle); determining that an aspect of the first object has changed; determining that an issue exists when [a parameter] of the second object does not change based on the change of the aspect of the first object, wherein the determining that the issue exists consists of, based upon the first image data, inferring a failure of an expected causal relationship between the change of the aspect of the first object and [the parameter] of the second object; receiving second sensor data from a second sensor of the first vehicle; determining that the issue exists when the second sensor data from the second sensor exceeds a predetermined threshold confirming the failure of the expected causal relationship; and sending an instruction to the first vehicle to complete autonomously an operation in response to the issue (Col. 2 lines 6-13 teaches detecting, by one or more computing devices (second sensor), that a (second) vehicle has remained stationary for a predetermined period of time; and changing the determination for the (second) vehicle from the first stationary state to the second stationary state (determines issue exists) based on the (second) vehicle having remained stationary for the predetermined period of time (sensor data exceeds threshold); Col. 7 lines 2-5 teaches the perception system 172 includes one or more cameras (first sensor which produces first image data); Col. 9 lines 58-67 teaches autonomous vehicle 100 detects that traffic light 536 (first object) is currently showing a red light and determines vehicle 520 to be in a short-term stationary state, but when traffic light 536 turns green (first object changes) and vehicle 520 remains stationary for some predetermined length of time (a parameter (movement of vehicle 520) does not change based on a change of the first object), autonomous vehicle 100 updates the designation of vehicle 520 to a long-term stationary state (determines an issue exists by inferring a failure of the expected causal relationship of vehicle movement and a change of the traffic light when the vehicle 520 fails to move when the traffic light 536 changes from red to green and sends an instruction to complete autonomously))”; however, Silver does not explicitly teach “in response to determining that the aspect of the first object changed, determining if there is a corresponding change of an acceleration or deceleration of a second object using the first sensor of the first vehicle”; the parameter corresponding to the change of the aspect of the first object as “the acceleration or deceleration” of the second object; and “where the first vehicle autonomously conducts the operation to address the issue in response to the instruction”.
From the same field of endeavor, Neubecker teaches “in response to determining that the aspect of the first object changed, determining if there is a corresponding change of an acceleration or deceleration of a second object using the first sensor of the first vehicle”; the parameter corresponding to the change of the aspect of the first object as “the acceleration or deceleration” of the second object; and “where the first vehicle autonomously conducts the operation to address the issue in response to the instruction (Par. [0055] lines 1-2 teaches a (first) vehicle determines that it is stopped at a light; Par. [0058] lines 1-2 and 7-9 teaches when the light turns green, the process determines if the vehicle (second vehicle) immediately in front of the present vehicle is moving (i.e., in response to the light turning green, determining if the second vehicle acceleration changes from a first value 0 in a previous stopped state at a red light to a second value greater than 0 in a current moving state at a green light); Par. [0060] lines 1-3 teaches the vehicle can alert other vehicles to a light change if the other vehicles are failing to move (i.e., acceleration does not change) upon a light change (i.e., the vehicle determines an issue exists when the second vehicle acceleration does not change from a first value 0 to a second value greater than 0 when the traffic light changes from red to green, and conducts an operation to generate an alert to address the issue and notify the second vehicle of the light change))”.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Silver to incorporate the teachings of Neubecker with a reasonable expectation of success to include in the method taught by Silver determining if there is a change of an acceleration or deceleration of the front vehicle in response to the aspect of the first object changing as taught by Neubecker, to have the parameter taught by Silver be an acceleration of the second vehicle as taught by Neubecker, and to have the first vehicle taught by Silver autonomously conduct the operation to address the issue in response to the instruction as taught by Neubecker.
The motivation for doing so would be to allow for notification of a distracted driver when a vehicle fails to move when a traffic light changes from red to green (Neubecker, Par. [0060] lines 4-5).
Regarding claim 2, the combination of Silver and Neubecker teaches all the limitations of claim 1 above, and further teaches “identifying the second object based on the first image data (Silver, Col. 3 lines 15-17 teaches autonomous vehicle detects a stationary vehicle (second object) using data from a camera)”.
Regarding claim 3, the combination of Silver and Neubecker teaches all the limitations of claim 2 above, and further teaches “wherein the first object is a traffic light and the second object is a second vehicle (Silver, Fig. 6 traffic light 536 (first object), vehicle 520 (second object))”.
Regarding claim 4, the combination of Silver and Neubecker teaches all the limitations of claim 3 above, and further teaches “when the aspect of the first object is a color of the traffic light (Silver, Col. 9 lines 58-59 and 63-64 teaches traffic light 536 showing a red light and then turns green) (Neubecker, Par. [0055] lines 1-2 teaches a (first) vehicle determines that it is stopped at a light; Par. [0058] lines 1-2 teaches the light turns green), the method further comprising: determining that the issue exists with the second vehicle when the traffic light changes from red to green and the acceleration of the second vehicle and an increase in a distance between the first vehicle and the second vehicle does not occur (Neubecker, Par. [0055] lines 1-2 teaches a (first) vehicle determines that it is stopped at a light; Par. [0058] lines 1-2 and 7-9 teaches when the light turns green, the process determines if the vehicle (second vehicle) immediately in front of the present vehicle is moving (i.e., determines if the distance between the vehicle and the second vehicle does not increase when the traffic light changes from red to green); Par. [0060] lines 1-3 teaches the vehicle can alert other vehicles to a light change if the other vehicles are failing to move (i.e., the second vehicle is not moving and preventing the first vehicle from accelerating from a stopped state) upon a light change (i.e., the vehicle determines an issue exists with the second vehicle when the distance between the vehicle and the front vehicle does not increase and when the second vehicle does not accelerate from a stopped state when the traffic light changes from red to green))”.
Regarding claim 5, the combination of Silver and Neubecker teaches all the limitations of claim 3 above, and further teaches “determining that an issue exists with the second vehicle when the traffic light changes from red to green and an acceleration of the second vehicle does not change (Silver, Col. 9 lines 58-67 teaches autonomous vehicle 100 detects that traffic light 536 is currently showing a red light and determines (second) vehicle 520 to be in a short-term stationary state, but when traffic light 536 turns green (changes from red to green) and vehicle 520 remains stationary for some predetermined length of time (acceleration of second vehicle does not change), autonomous vehicle 100 updates the designation of vehicle 520 to a long-term stationary state (determines an issue exists with the second vehicle)) (Neubecker, Par. [0055] lines 1-2 teaches a (first) vehicle determines that it is stopped at a light; Par. [0058] lines 1-2 and 7-9 teaches when the light turns green, the process determines if the vehicle (second vehicle) immediately in front of the present vehicle is moving (i.e., determines if the second vehicle accelerates from a stopped state to a moving state); Par. [0060] lines 1-3 teaches the vehicle can alert other vehicles to a light change if the other vehicles are failing to move (i.e., not accelerating) upon a light change (i.e., the vehicle determines an issue exists with the second vehicle when the second vehicle does not accelerate when the traffic light changes from red to green))”.
Regarding claim 7, the combination of Silver and Neubecker teaches all the limitations of claim 1 above, and further teaches “wherein the first vehicle is static (Neubecker, Par. [0055] lines 1-2 teaches a vehicle determining that it is stopped at a light)”.
Regarding claim 10, the combination of Silver and Neubecker teaches all the limitations of claim 1 above, and further teaches “wherein the first object is a traffic light (Silver, Fig. 6 traffic light 536)”.
Regarding claim 16, Silver teaches “A non-transitory computer readable medium having stored thereon instructions, which when executed by a processor cause the processor to execute a method (Col. 2 lines 29-33 teaches non-transitory, tangible computer-readable storage medium on which computer readable instructions of a program are stored, the instructions, when executed by a processor, cause the processor to perform a method), the method comprising: receiving first image data from a first sensor of a first vehicle (Col. 1 lines 18-20 teaches the vehicle perceives and interprets its surroundings using cameras); identifying a first object based on the first image data (Col. 1 lines 45-47 teaches identifying one or more traffic control factors (objects) relating to a first vehicle); determining that an aspect of the first object has changed; determining that an issue exists when [parameters] do not change based on the change of the aspect of the first object, wherein the determining that the issue exists consists of, based upon the first image data, inferring a failure of an expected causal relationship between the change of the aspect of the first object and [the parameters] of the second object; and sending an instruction to the first vehicle to complete autonomously an operation in response to the issue (Col. 2 lines 6-13 teaches detecting, by one or more computing devices (second sensor), that a (second) vehicle has remained stationary for a predetermined period of time; and changing the determination for the (second) vehicle from the first stationary state to the second stationary state (determines issue exists) based on the (second) vehicle having remained stationary for the predetermined period of time (sensor data exceeds threshold); Col. 7 lines 2-5 teaches the perception system 172 includes one or more cameras (first sensor which produces first image data); Col. 9 lines 58-67 teaches autonomous vehicle 100 detects that traffic light 536 (first object) is currently showing a red light and determines vehicle 520 to be in a short-term stationary state, but when traffic light 536 turns green (first object changes) and vehicle 520 remains stationary for some predetermined length of time (parameters (acceleration and displacement of vehicle 520) do not change based on a change of the aspect of the first object), autonomous vehicle 100 updates the designation of vehicle 520 to a long-term stationary state (determines an issue exists by inferring a failure of the expected causal relationship of vehicle movement and a change of the traffic light when the vehicle 520 fails to move when the traffic light 536 changes from red to green and sends an instruction to complete autonomously))”; however, Silver does not explicitly teach “in response to determining that the aspect of the first object changed, determining if there is a change of an acceleration of a second object and a distance traveled by the second object over a period of time using the first sensor of the first vehicle”; the parameter corresponding to the change of the aspect of the first object as “the acceleration of the second object and the distance traveled by the second object over the period of time”; and “where the first vehicle autonomously conducts the operation to address the issue in response to the instruction”.
From the same field of endeavor, Neubecker teaches “in response to determining that the aspect of the first object changed, determining if there is a change of an acceleration of a second object and a distance traveled by the second object over a period of time using the first sensor of the first vehicle”; the parameter corresponding to the change of the aspect of the first object as “an acceleration of a second object and distance traveled by the second object over the period of time”; and “where the first vehicle autonomously conducts the operation to address the issue in response to the instruction (Par. [0055] lines 1-2 teaches a (first) vehicle determines that it is stopped at a light; Par. [0058] lines 1-2 and 7-9 teaches when the light turns green, the process determines if the vehicle (second vehicle) immediately in front of the present vehicle is moving (i.e., measuring if the second vehicle acceleration changes from a first value 0 in a previous stopped state at a red light to a second value greater than 0 in a current moving state at a green light and if the distance traveled by the front vehicle changes from zero when the traffic light changes from red to green); Par. [0060] lines 1-3 teaches the vehicle can alert other vehicles to a light change if the other vehicles are failing to move (i.e., the second vehicle acceleration does not change and the distance traveled by the front vehicle has not changed from zero) upon a light change (i.e., the vehicle determines an issue exists when the second vehicle acceleration does not change from a first value 0 to a second value greater than 0 and when a distance travelled by the front vehicle does not change from zero when the traffic light changes from red to green, and conducts an operation to generate an alert to address the issue and notify the second vehicle of the light change))”.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Silver to incorporate the teachings of Neubecker with a reasonable expectation of success to include in the method taught by Silver determining if there is a change of acceleration of the front vehicle and a distance traveled by the front vehicle in response to the aspect of the first object changing as taught by Neubecker, to have the parameter taught by Silver be an acceleration of the second vehicle and a distance traveled by the second vehicle as taught by Neubecker, and to have the first vehicle taught by Silver autonomously conduct the operation to address the issue in response to the instruction as taught by Neubecker.
The motivation for doing so would be to allow for notification of a distracted driver (Neubecker, Par. [0060] lines 4-5).
Regarding claim 17, the combination of Silver and Neubecker teaches all the limitations of claim 16 above, and further teaches “wherein the method further includes: identifying the second object based on the first image data (Silver, Col. 3 lines 15-17 teaches autonomous vehicle detects a stationary vehicle (second object) using data from a camera); and determining that the issue exists with the second object when the acceleration of the second object and the distance traveled by the second object over the period of time associated with the second object do not change based on the change of the first object (Silver, Col. 9 lines 58-67 teaches autonomous vehicle 100 detects that traffic light 536 (first object) is currently showing a red light and determines vehicle 520 (second object) to be in a short-term stationary state, but when traffic light 536 turns green (first object changes) and vehicle 520 remains stationary for some predetermined length of time (parameters associated with the second object (acceleration and displacement of vehicle 520) corresponding to a change of the first object do not change), autonomous vehicle 100 updates the designation of vehicle 520 to a long-term stationary state (determines an issue exists with the second object)) (Neubecker, Par. [0055] lines 1-2 teaches a (first) vehicle determines that it is stopped at a light; Par. [0058] lines 1-2 and 7-9 teaches when the light turns green, the process determines if the vehicle (second vehicle) immediately in front of the present vehicle is moving (i.e., measuring if the acceleration of the second vehicle changes from 0 in a previous stopped state at a red light to a positive value in a current moving state at a green light and if the distance traveled by the front vehicle changes from zero when the traffic light changes from red to green); Par. [0060] lines 1-3 teaches the vehicle can alert other vehicles to a light change if the other vehicles are failing to move (i.e., the second vehicle acceleration does not change and the distance traveled by the front vehicle has not changed from zero) upon a light change (i.e., the vehicle determines an issue exists when the second vehicle acceleration does not change and when a distance travelled by the front vehicle does not change from zero when the traffic light changes from red to green, and conducts an operation to generate an alert to address the issue and notify the second vehicle of the light change))”.
Regarding claim 18, the combination of Silver and Neubecker teaches all the limitations of claim 17 above, and further teaches “wherein the first object is a traffic light and the second object is a second vehicle (Fig. 6 traffic light 536 (first object), vehicle 520 (second object))”.
Regarding claim 19, the combination of Silver and Neubecker teaches all the limitations of claim 18 above, and further teaches “wherein the method further includes: determining that the issue exists with the second vehicle when the traffic light changes from red to green and a distance between the first vehicle and the second vehicle does not increase and the second vehicle does not accelerate away from the first vehicle (Neubecker, Par. [0055] lines 1-2 teaches a (first) vehicle determines that it is stopped at a light; Par. [0058] lines 1-2 and 7-9 teaches when the light turns green, the process determines if the vehicle (second vehicle) immediately in front of the present vehicle is moving (i.e., determines if the distance between the vehicle and the front vehicle does not increase and if the second vehicle does not accelerate from a stopped state to a moving state away from the first vehicle); Par. [0060] lines 1-3 teaches the vehicle can alert other vehicles to a light change if the other vehicles are failing to move (i.e., the distance between the second vehicle and the stationary first vehicle is not increasing and the second vehicle is not accelerating away from the first vehicle) upon a light change (i.e., the vehicle determines an issue exists with the second vehicle when the distance between the vehicle and the front vehicle does not increase and when the second vehicle does not accelerate away from the first vehicle when the traffic light changes from red to green))”.
Regarding claim 20, the combination of Silver and Neubecker teaches all the limitations of claim 18 above, and further teaches “wherein the method further includes: determining that an issue exists with the second vehicle when the traffic light changes from red to green (Silver, Col. 9 lines 58-67 teaches autonomous vehicle 100 detects that traffic light 536 is currently showing a red light and determines (second) vehicle 520 to be in a short-term stationary state, but when traffic light 536 turns green (changes from red to green) and vehicle 520 remains stationary for some predetermined length of time, autonomous vehicle 100 updates the designation of vehicle 520 to a long-term stationary state (determines an issue exists with the second vehicle))”.
Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Silver et al. (US 9,558,659 B1) in view of Neubecker et al. (US 2017/0243481 A1) and further in view of Salomonsson et al. (US 2016/0232414 A1).
Regarding claim 9, the combination of Silver and Neubecker teaches all the limitations of claim 1 above, however the combination of Silver and Neubecker does not explicitly teach “providing a notification to a user of the first vehicle based on the determination that the issue exists”.
From the same field of endeavor, Salomonsson teaches “providing a notification to a user of the first vehicle based on the determination that the issue exists (Par. [0015] lines 8-18 teaches the system determines when the equipped vehicle (first vehicle) is at a traffic light (first object) and when another vehicle is ahead of the equipped vehicle, and when the light changes from red to green (first object changes) and the leading vehicle moves a threshold distance and the equipped vehicle is not moving (i.e., determines an issue exists), the system generates an alert signal)”.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of the combination of Silver and Neubecker to incorporate the teachings of Salomonsson to include in the method taught by the combination of Silver and Neubecker providing a notification to the vehicle user when an issue is determined to exist as taught by Salomonsson.
The motivation for doing so would be to prevent the driver from being slow to start moving when a traffic signal changes to a green light due to inattention (Salomonsson, Par. [0014] lines 11-13).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to KATHERINE M FITZHARRIS whose telephone number is (469)295-9147. The examiner can normally be reached 7:30 am - 6:00 pm M-Th.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, CHRISTIAN CHACE can be reached on (571)272-4190. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/K.M.F./Examiner, Art Unit 3665
/CHRISTIAN CHACE/Supervisory Patent Examiner, Art Unit 3665