Detailed Action
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Acknowledgements
2. Applicant’s arguments/remarks, filed 01/16/2026, are acknowledged, as are amended claim 1 and previously unexamined claims 9-11. Claims 1-11 remain pending and have been examined.
Response to Arguments
3. Applicant’s arguments, see pages 1-3, filed 01/16/2026, with respect to the rejection of claim 1 under 35 U.S.C. 103 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn.
However, upon further consideration, a new ground of rejection under 35 U.S.C. 102(a)(1) is made in view of Balasubramanian et al. (US 10,771,669 B1).
With regard to Balasubramanian, on pages 3-4, Applicant asserts that the reference does not teach a system clock that both controls rotational movement of a LIDAR and controls the capturing of images, and that the reference therefore does not teach “controlling a rotational movement of the rotational device using the system clock, the rotational movement synchronized with the capturing of the image data via the system clock”.
4. Response
Examiner respectfully disagrees for the following reasons:
Paragraph [0045] of the instant application states: “…the rotational scanning movement of the LIDAR is synchronized with the masterclock of the system, such as the SDS's precision time protocol (“PTP”) grandmaster. It is also advantageous to synchronize the shutter control of at least one of the cameras of the AV with the masterclock of the system.” In addition, [0046] states: “…the LIDAR timing or clock can be synchronized to PTP. Similarly, the camera shutter signal may be synchronized to the SDS's PTP. Thus, the images captured via the cameras can be in sync with the LIDAR captured representation, thereby improving the combination of LIDAR data with camera captured images.”
The prior art, in column 7 (lines 1-16), similarly teaches: “…control component 202 may cause the lidar sensor 204 to produce lidar data by transmitting a PPS+NMEA signal to the lidar sensor 204 to indicate the current master time of the multi-sensor environment 200 to the lidar sensor 204.” Column 9 (lines 56-67) teaches: “…control component 202 derives a then-current master time as a threshold operation for causing a sensor to produce sensor data. The control component 202 then transmits a signal (e.g., using a PPS+NMEA signal, a PTP handshake, a CAN trigger, a TTL trigger, or the like), which signal may in some cases include the master time, to a sensor to cause it to produce the sensor data…the sensor receives the master time from the control component 202, determines a time according to its own sensor clock at which it produced sensor data, and then adjusts that time based on the master time to determine the master time at which it produced the sensor data…”
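For illustration of the clock-adjustment mechanism quoted above, the following minimal Python sketch is offered; it forms no part of the instant application or of Balasubramanian, and all names in it are hypothetical. It models a sensor mapping a locally timestamped capture onto the master time conveyed by a control component:

    def to_master_time(local_event_time, master_time_at_signal, local_time_at_signal):
        """Adjust a sensor-clock timestamp to the master clock.

        The sensor receives the master time from the control component,
        notes its own clock at receipt, and applies the resulting offset
        to the sensor-clock time at which it produced the data.
        """
        offset = master_time_at_signal - local_time_at_signal
        return local_event_time + offset

    # Example: the signal conveys master time 1000.000 s while the sensor
    # clock reads 990.004 s at receipt; a capture stamped 990.104 s on the
    # sensor clock maps to master time 1000.100 s (up to float rounding).
    print(to_master_time(990.104, 1000.000, 990.004))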
Further, [0056] teaches: “a signal transmitted to the camera 206 to cause it to produce camera data may include an indication of the master time…”.
Further, column 16 (lines 37-56) teaches: “The lidar data, which may, for example, be transmitted as a UDP packet, can be used to determine when the lidar sensor will be rotated in a certain direction next. For example, based on the rotation speed of a lidar sensor (e.g., 10 Hz), the lidar data can be used to determine a master time at which the lidar sensor will be angled 0 degrees, 90 degrees, 180 degrees, or 270 degrees with respect to an aspect of the multi-sensor environment (e.g., a front side thereof) or a field of view of another sensor (e.g., a visible-light camera)…”. [0034] teaches time synchronization with respect to a master clock, wherein sensor data may be determined to correspond to a common event and the system may adjust position or motion accordingly.
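To illustrate the angle-prediction arithmetic in the passage quoted above, the following minimal Python sketch is offered; it forms no part of either document and uses hypothetical names. It computes the master time at which a lidar spinning at a known rate will next face a given azimuth:

    def time_at_angle(ref_master_time, ref_angle_deg, rate_hz, target_angle_deg):
        """Master time of the next crossing of target_angle_deg.

        A lidar spinning at rate_hz (e.g., 10 Hz) sweeps 360 * rate_hz
        degrees per second, so the next crossing follows from the angular
        distance still to travel from the reference angle.
        """
        deg_per_sec = 360.0 * rate_hz
        delta_deg = (target_angle_deg - ref_angle_deg) % 360.0
        return ref_master_time + delta_deg / deg_per_sec

    # Example: at master time 5.000 s the sensor faces 0 degrees and spins
    # at 10 Hz; it will next face 90 degrees at master time 5.025 s.
    print(time_at_angle(5.000, 0.0, 10.0, 90.0))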
Therefore, the Examiner maintains that the limitations in question are met by the cited prior art.
5. Unexamined claims 9-11 are addressed below.
Claim Rejections - 35 USC § 102
6. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
7. Claims 1-11 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Balasubramanian et al. (US 10,771,669 B1).
8. Regarding claim 1, Balasubramanian teaches a method of controlling a rotational imaging device, the method comprising:
capturing imaging data from a sensor in the imaging device using a system clock (…Balasubramanian, in column 9 (lines 55-67) through column 10 (lines 1-20), teaches a control component 202 that derives a master time as a threshold operation for causing a sensor to produce sensor data; a signal including the master time is transmitted by the control component 202 (e.g., using a PTP handshake) to a camera, whereby the camera produces camera data; a computer 214 receives the signal and determines the sensor time of camera data production, from which the corresponding master time is determined; the computer 214 then transmits a signal indicating the master time of the camera data to the control component 202; Fig. 2…); and
controlling a rotational movement of the rotational imaging device using the system clock (…column 16 (lines 37-56) teaches that lidar data produced by the sensor is used to determine when the lidar sensor will next be rotated in a certain direction; based on the rotation speed of the lidar sensor (e.g., 10 Hz), the lidar data can be used to determine the master time at which the sensor will be at a given angle…),
the rotational movement synchronized with the capturing of the image data via the system clock (…column 16 (lines 37-56) further teaches that the positioning of the lidar sensor can be in accordance with a field of view of a visible-light camera, and Fig. 7 depicts time synchronization of the cameras and the lidar sensor with respect to a master clock…).
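As an illustration of how a single clock can drive both functions recited in claim 1 (capture timing and rotational position), the following minimal Python sketch is offered; it forms no part of the instant application or the reference, and its names are hypothetical. Both the commanded lidar angle and the next camera shutter time are derived from the same master time:

    def commanded_angle(master_time, rate_hz):
        """Lidar angle commanded purely from the master clock."""
        return (master_time * rate_hz * 360.0) % 360.0

    def next_shutter_time(master_time, rate_hz, camera_angle_deg):
        """Next master time at which the rotating lidar sweeps the camera
        boresight; the shutter is triggered from the same clock."""
        period = 1.0 / rate_hz
        target_phase = (camera_angle_deg / 360.0) * period
        t = (master_time // period) * period + target_phase
        return t if t > master_time else t + period

    # Example: at master time 5.012 s with a 10 Hz lidar, the commanded
    # angle is approximately 43.2 degrees, and a camera boresighted at
    # 90 degrees fires at approximately master time 5.025 s.
    print(commanded_angle(5.012, 10.0))
    print(next_shutter_time(5.012, 10.0, 90.0))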
9. Regarding claim 2, Balasubramanian teaches the method of claim 1 (see claim 1 above), wherein the system clock is a masterclock of the system (…wherein Fig. 7 depicts a master clock with respect to a camera system and LIDAR sensors…).
10. Regarding claim 3, Balasubramanian teaches the method of claim 2 (see claim 2 above), wherein the masterclock uses a precision time protocol (…Balasubramanian, in column 9 (lines 55-67), teaches a control component 202 that derives a master time as a threshold operation for causing a sensor to produce sensor data, and transmits a signal including the master time (e.g., using a PTP handshake)…).
11. Regarding claim 4, Balasubramanian teaches the method of claim 1, further comprising:
controlling a view angle of the sensor of the rotational imaging device with respect to an azimuthal angle to maintain a predetermined constant azimuthal angle (…column 16 (lines 37-56) teaches that lidar data produced by the sensor is used to determine when the lidar sensor will be rotated in a certain direction; based on the rotational speed of the lidar sensor, the lidar data is used to determine a master time at which the lidar sensor will be angled at a particular degree, e.g., 0 degrees through 270 degrees, which may be viewed as examples of azimuth angles…).
12. Regarding claim 5, Balasubramanian teaches the method of claim 1 (see claim 1 above), further comprising:
controlling a rotational scan speed of the rotational imaging device to minimize positional drift in the view angle (…column 16 (lines 28-36) teaches that the computer may determine phase offsets for the cameras based on angles of the lidar sensor; as stated previously, based on the rotation speed of the lidar sensor, lidar data is used to determine the master time at which the lidar sensor will be angled at a particular position with respect to a field of view of a camera…).
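For illustration of phase offsets of the kind described in column 16 (lines 28-36), the following minimal Python sketch is offered; it forms no part of either document and uses hypothetical names. It computes per-camera trigger offsets from camera boresight angles and the lidar rotation rate:

    def camera_phase_offsets(camera_angles_deg, rate_hz):
        """Seconds after the lidar's 0-degree crossing at which each camera
        should be triggered so its exposure coincides with the lidar sweep
        through its field of view."""
        period = 1.0 / rate_hz  # duration of one revolution
        return {angle: (angle % 360.0) / 360.0 * period
                for angle in camera_angles_deg}

    # Four cameras at 0/90/180/270 degrees around a 10 Hz lidar yield
    # offsets of 0.0, 0.025, 0.05, and 0.075 s (up to float rounding).
    print(camera_phase_offsets([0, 90, 180, 270], 10.0))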
13. Regarding claim 6, Balasubramanian teaches the method of claim 4 (see claim 4 above), further comprising:
controlling an angular position of the rotational imaging device to minimize a positional drift in the view angle (…column 16 (lines 28-36) teaches that the computer may determine phase offsets for the cameras based on angles of the lidar sensor; as stated previously, based on the rotation speed of the lidar sensor, lidar data is used to determine the master time at which the lidar sensor will be angled at a particular position with respect to a field of view of a camera…).
14. Regarding claim 7, Balasubramanian teaches the method of claim 1, further comprising:
dividing the imaging data into at least two parts (…Balasubramanian, in column 16 (lines 37-65), teaches determining when a lidar sensor will be angled 0 degrees, 90 degrees, 180 degrees, or 270 degrees with respect to an aspect of the multi-sensor environment, wherein the multi-sensor environment may include a camera having a field of view corresponding to each of those lidar sensor angles. Further, a component 712 can cause cameras of the system to produce sensor data based on the angles of the lidar sensor, receiving a UDP packet from a sensor 702 indicating one angular position with respect to camera 706 at one time and a different angular position at another time. Thus, the imaging data may be said to be divided by the angular position of the sensing devices…).
15. Regarding claim 8, Balasubramanian teaches the method of claim 7, wherein the dividing the imaging data comprises:
grouping the imaging data parts according to an angular position or range of a scan at which the data parts were acquired (…Balasubramanian, in column 16 (lines 37-65), teaches determining when a lidar sensor will be angled 0 degrees, 90 degrees, 180 degrees, or 270 degrees with respect to an aspect of the multi-sensor environment, wherein the multi-sensor environment may include a camera having a field of view corresponding to each of those lidar sensor angles; a component 712 can cause cameras of the system to produce sensor data based on the angles of the lidar sensor, receiving a UDP packet from a sensor 702 indicating one angular position with respect to camera 706 at one time and a different angular position at another time. Thus, the imaging data parts may be said to be grouped by the angular position at which they were acquired…).
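To illustrate grouping imaging data parts by the angular position or range at which they were acquired, the following minimal Python sketch is offered; it forms no part of either document and uses hypothetical names. It buckets captured parts into 90-degree sectors:

    from collections import defaultdict

    def group_by_angle(parts, sector_deg=90.0):
        """Bucket (angle_deg, payload) pairs into angular sectors,
        e.g., [0, 90), [90, 180), [180, 270), and [270, 360)."""
        groups = defaultdict(list)
        for angle_deg, payload in parts:
            sector = int((angle_deg % 360.0) // sector_deg) * int(sector_deg)
            groups[sector].append(payload)
        return dict(groups)

    parts = [(12.0, "scan-a"), (95.5, "scan-b"), (181.0, "scan-c"),
             (358.0, "scan-d"), (91.2, "scan-e")]
    print(group_by_angle(parts))
    # -> {0: ['scan-a'], 90: ['scan-b', 'scan-e'], 180: ['scan-c'],
    #     270: ['scan-d']}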
16. Regarding claim 9, Balasubramanian teaches the method of claim 1 (see claim 1 above), wherein controlling the rotational movement of the rotational imaging device to be synchronized with the capturing of the image data via the same system clock (see claim 1) includes:
a shutter of a sensor (…column 16 (lines 1-12) teaches that a camera system receives signals from a component 712 (which includes a master clock; Fig. 7) that control the cameras to produce data and the timing thereof, wherein producing sensor data entails, e.g., the opening of a camera shutter…) and
the rotational movement of the rotational imaging device to be in sync using the same system clock (…column 16 (lines 37-56) teaches that lidar data produced by the sensor is used to determine when the lidar sensor will next be rotated in a certain direction; based on the rotation speed of the lidar sensor, the lidar data can be used to determine the master time at which the sensor will be at a given angle…).
17. Regarding claim 10, Balasubramanian teaches the method of claim 1 (see claim 1 above), wherein controlling the rotational movement of the rotational imaging device to be synchronized with the capturing of the image data via the same system clock (see claim 1) includes controlling at least one of
a rotational speed (…column 16 (lines 37-56) teaches that lidar data produced by the sensor is used to determine when the lidar sensor will next be rotated in a certain direction; based on the rotation speed of the lidar sensor, the lidar data can be used to determine the master time at which the sensor will be at a given angle…).
18. Regarding claim 11, Balasubramanian teaches the method of claim 1 (see claim 1 above), wherein controlling the rotational movement of the rotational imaging device to be synchronized with the capturing of the image data via the same system clock includes both
a rotational speed and an angular position of the sensor (…column 16 (lines 37-56) teaches that lidar data produced by the sensor is used to determine when the lidar sensor will next be rotated in a certain direction; based on the rotation speed of the lidar sensor, the lidar data can be used to determine the master time at which the sensor will be at a given angle, thereby tying both the speed and the angular position of the sensor to the master clock…).
Conclusion
19. THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SURAFEL YILMAKASSAYE whose telephone number is (703)756-1910. The examiner can normally be reached Monday-Friday 8:30am-5:00pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, TWYLER HASKINS can be reached at (571)272-7406. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SURAFEL YILMAKASSAYE/Examiner, Art Unit 2639
/TWYLER L HASKINS/Supervisory Patent Examiner, Art Unit 2639