DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 1-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-20 of U.S. Patent No. 11,526,711 B1. Although the claims at issue are not identical, they are not patentably distinct from each other because application claim 1 is anticipated by patent claim 1.
Instant app. 19/055,684 vs. US Patent No. 11,526,711 B1
(Exemplary claims compared below. Claims of the instant application are labeled "Instant"; claims of the reference patent are labeled "Patent.")

Instant claim 1. A computer-implemented method for synchronizing road segment parameters with images of a road segment to facilitate improved analysis of information about the road segment, the computer-implemented method comprising: time-synchronizing, by one or more processors, a set of images and a plurality of road segment parameters according to a common clock based on an occurrence of a common event represented in the set of images and the plurality of road segment parameters; storing, by the one or more processors, the set of images and the plurality of road segment parameters to be referenceable by a set of timestamps; and analyzing, by the one or more processors, the set of images and the plurality of road segment parameters to identify one or more vehicle events, each vehicle event characterized by a subset of the set of images and a subset of the plurality of road segment parameters.

Instant claim 2. The computer-implemented method of claim 1, further comprising: timestamping, by the one or more processors, the set of images according to a first clock associated with an image sensor and the plurality of road segment parameters according to a second clock associated with one or more vehicle sensors or one or more infrastructure devices disposed proximate to the road segment.

Instant claim 3. The computer-implemented method of claim 1, wherein detecting the plurality of road segment parameters comprises detecting both a first set of road segment parameters collected via one or more vehicle sensors and detecting a second set of road segment parameters collected via one or more infrastructure devices disposed proximate to the road segment.

Patent claim 1. A computer-implemented method for synchronizing road segment parameters with images of a road segment to facilitate improved analysis or display of information about the road segments, the method comprising, via one or more local or remote processors, servers, transceivers, and/or sensors: (A) capturing, via an image sensor, a set of images of a road segment during a time-period; (B) detecting a plurality of road segment parameters associated with the road segment during the time-period, wherein the plurality of road segment parameters includes: (i) a first set of road segment parameters collected via one or more vehicle sensors and (ii) a second set of road segment parameters collected via one or more infrastructure devices disposed proximate to the road segment; (C) timestamping the set of images according to a first clock associated with the image sensor and the plurality of road segment parameters according to a second clock associated with the one or more vehicle sensors or the one or more infrastructure devices; (D) time-synchronizing the set of images and the plurality of road segment parameters according to a common clock based on an occurrence of a common event represented in the set of images and the plurality of road segment parameters; (E) storing the set of images and the plurality of road segment parameters as a set of stored data such that the set of images, the first set of road segment parameters, and the second set of road segment parameters are linked to a set of timestamps by which the set of images and the plurality of road segment parameters are referenceable; and (F) performing one or more of: (i) displaying the set of images and the plurality of road segment parameters according to a chronological order determined based upon the set of timestamps, or (ii) analyzing the set of images and the plurality of road segment parameters to identify one or more vehicle events, each characterized by a subset of the set of images and a subset of the plurality of road segment parameters.

Instant claim 4. The computer-implemented method of claim 1, wherein the plurality of road segment parameters comprises a parameter representing a position, a heading, or a speed of a vehicle in which one or more vehicle sensors are disposed.
Patent claim 3. The computer-implemented method of claim 1, wherein the first set of road segment parameters comprises a parameter representing a position, a heading, or a speed of a vehicle in which the one or more vehicle sensors are disposed.

Instant claim 5. The computer-implemented method of claim 1, wherein the plurality of road segment parameters includes a first position for a first vehicle and a second position for a second vehicle, and wherein the computer-implemented method further comprises calculating a distance between the first vehicle and the second vehicle based upon the first position and the second position.
Patent claim 4. The computer-implemented method of claim 1, wherein the first set of road segment parameters includes a first position for a first vehicle and a second position for a second vehicle, and wherein the method further comprises calculating a distance between the first and second vehicles based upon the first and second positions.

Instant claim 6. The computer-implemented method of claim 1, wherein the plurality of road segment parameters comprises a health status parameter representing a degree of health for an infrastructure component.
Patent claim 5. The computer-implemented method of claim 1, wherein the second set of road segment parameters comprises a health status parameter representing a degree of health for an infrastructure component.

Instant claim 7. The computer-implemented method of claim 1, wherein the plurality of road segment parameters comprises an operational status of an infrastructure component.
Patent claim 6. The computer-implemented method of claim 1, wherein the second set of road segment parameters comprises an operational status of an infrastructure component.

Instant claim 8. The computer-implemented method of claim 7, wherein the infrastructure component is a traffic signal and wherein the operational status indicates that a streetlight was green, yellow, or red at a given time.
Patent claim 7. The computer-implemented method of claim 6, wherein the infrastructure component is a traffic signal and wherein the operational status indicates that the streetlight was green, yellow, or red at a given time.

Instant claim 9. The computer-implemented method of claim 1, wherein the plurality of road segment parameters is collected by a radar device or a motion sensing device.
Patent claim 8. The computer-implemented method of claim 1, wherein the infrastructure device is a radar device or a motion sensing device.

Instant claim 10. The computer-implemented method of claim 1, wherein the plurality of road segment parameters includes one or more parameters indicating an atmospheric condition at the road segment.
Patent claim 9. The computer-implemented method of claim 1, wherein the second set of road segment parameters includes one or more parameters indicating an atmospheric condition at the road segment.

Instant claim 11. The computer-implemented method of claim 1, further comprising: capturing, via an image sensor, the set of images of a road segment during a time-period; and detecting, by one or more processors, the plurality of road segment parameters associated with the road segment during the time-period.
Patent claim 2. The computer-implemented method of claim 1, wherein detecting the plurality of road segment parameters comprises detecting both the first set of parameters and detecting the second set of parameters.

Instant claim 12. The computer-implemented method of claim 2, wherein time-synchronizing the set of images and the plurality of road segment parameters comprises: analyzing the set of images to identify: (i) at least one image corresponding to the occurrence of the common event, and (ii) a first time period, according to the first clock associated with an image sensor, during which the common event took place; analyzing the plurality of road segment parameters to identify: (i) at least one road segment parameter in the plurality of road segment parameters corresponding to the occurrence of the common event, and (ii) a second time period, according to the second clock associated with the one or more vehicle sensors or the one or more infrastructure devices, during which the common event took place; and setting the set of timestamps for the plurality of road segment parameters according to the common clock such that a first subset of the set of timestamps for the at least one image matches a subset of the set of timestamps for the plurality of road segment parameters with respect to a time at which the common event took place.
Patent claim 10. The computer-implemented method of claim 1, wherein time-synchronizing the set of images and the plurality of road segment parameters comprises: analyzing the set of images to identify: (i) at least one image corresponding to the occurrence of the common event, and (ii) a first time period, according to the first clock associated with the image sensor, during which the common event took place; analyzing the plurality of road segment parameters to identify: (i) at least one road segment parameter in the plurality of road segment parameters corresponding to the occurrence of the common event, and (ii) a second time period, according to the second clock associated with the one or more vehicle sensors or the one or more infrastructure devices, during which the common event took place; and setting the set of timestamps for the plurality of road segment parameters according to the common clock such that a first subset of the set of timestamps for the one or more images matches a subset of the set of timestamps for the second set of road segment parameters with respect to a time at which the common event took place.
Claims 13-19 recite all of the same elements as claims 1-6 and 12, but in system form rather than method form. Therefore, the supporting rationale of the rejection of claims 1-6 and 12 applies equally to claims 13-19.
Claim 20 recites all of the same elements as claim 1, but in tangible machine-readable medium form rather than method form. Therefore, the supporting rationale of the rejection of claim 1 applies equally to claim 20.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DANIEL T TEKLE whose telephone number is (571)270-1117. The examiner can normally be reached Monday-Friday 8:00-4:30 ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, William Vaughn, can be reached at 571-272-3922. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DANIEL T TEKLE/Primary Examiner, Art Unit 2481