Prosecution Insights
Last updated: April 19, 2026
Application No. 17/975,943

TIME-OF-FLIGHT MOTION MISALIGNMENT ARTIFACT CORRECTION

Status: Final Rejection (§102, §103)
Filed: Oct 28, 2022
Examiner: BAGHDASARYAN, HOVHANNES
Art Unit: 3645
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: GM Cruise Holdings LLC
OA Round: 2 (Final)
Grant Probability: 78% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 1m
With Interview: 94%

Examiner Intelligence

Career Allow Rate: 78%, above average (759 granted / 971 resolved; +26.2% vs TC avg)
Interview Lift: +16.1% for resolved cases with an interview
Typical Timeline: 3y 1m average prosecution; 85 applications currently pending
Career History: 1,056 total applications across all art units

Statute-Specific Performance

§101: 2.6% (-37.4% vs TC avg)
§102: 21.5% (-18.5% vs TC avg)
§103: 45.7% (+5.7% vs TC avg)
§112: 23.9% (-16.1% vs TC avg)
Deltas are measured against an estimated Tech Center average. Based on career data from 971 resolved cases.
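The per-statute deltas are internally consistent: assuming each "± vs TC avg" figure is a simple percentage-point difference, every statute back-solves to the same estimated Tech Center rejection rate. A quick sketch, using the values from the table above:

```python
# This examiner's per-statute rejection rates and their stated deltas
# versus the Tech Center average, in percentage points.
examiner_rates = {"101": 2.6, "103": 45.7, "102": 21.5, "112": 23.9}
deltas_vs_tc = {"101": -37.4, "103": 5.7, "102": -18.5, "112": -16.1}

# Back-solve the implied Tech Center average: examiner rate minus delta.
tc_avg = {s: round(examiner_rates[s] - deltas_vs_tc[s], 1)
          for s in examiner_rates}

print(tc_avg)  # every statute implies a 40.0% Tech Center average
```

That all four statutes land on exactly 40.0% suggests the dashboard computes deltas against a single pooled Tech Center baseline rather than per-statute baselines.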

Office Action

Grounds: §102, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments filed 02/24/2026 have been fully considered but they are not persuasive. The prior art explicitly teaches identifying frames in fig. 5 and identifying the velocities based on those frames. The claim does not clarify how the velocities are calculated. As long as the prior art has all the frames from fig. 5 and uses them in some way to calculate velocity, the claim limitations are met.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1, 12, and 20, and the claims below, are rejected under 35 U.S.C. 102(a)(1) as being anticipated by D1, US 20160232684 A1.

Regarding claims 1, 12, and 20, D1 teaches: A time-of-flight sensor system, comprising: a receiver system (104) comprising a sensor; and a computing system (102) in communication with the receiver system, comprising: a processor (102); and memory (122) that stores computer-executable instructions that, when executed by the processor, cause the processor to perform acts comprising: receiving a stream of frames outputted by the receiver system of the time-of-flight sensor system, the stream of frames comprising a series of frame sequences, wherein a frame sequence comprises a set of frames where the frames in the set have different frame types [0017] (and fig. 5), wherein a frame type of a frame signifies sensor parameters of the time-of-flight sensor system when the frame is captured by the time-of-flight sensor system such that the different frame types signify different sensor parameters (fig. 5, different time window parameters; as the time windows differ, the illumination can correspond to two different times of day, such as sunny and dark); identifying a pair of non-adjacent frames having differing relative phase delays in the stream of frames [0038, 0073] (fig. 5); calculating computed optical flow data based on the pair of non-adjacent frames in the stream of frames [0038, 0073]; generating estimated optical flow data for at least one differing frame other than the pair of non-adjacent frames in the stream of frames based on the computed optical flow data [0038, 0073, 0081]; realigning the at least one differing frame based on the estimated optical flow data [0064-0065]; computing object depth data for an object based on realigned frames in the frame sequence [0067]; and generating transverse velocity estimate data, including at least one of vertical and horizontal velocity estimate data and radial velocity estimate data ([0076, 0078]; Vx and Vy are vertical and horizontal velocity estimate data, and are data that form part of the radial velocity and therefore can be considered radial velocity estimate data), for the object based on the computed optical flow data and the object depth data for the object [0076-0078].

2. The computing system of claim 1, wherein the transverse velocity estimate data for the object is further generated based on an area in an environment of the time-of-flight sensor system included in a field of view of the frames. [0065] (implicit, as stationary points are aligned and those are the points of the environment in the FOV)

3. The computing system of claim 1, the acts further comprising: generating estimated optical flow data for at least one differing frame other than the pair of non-adjacent frames in the stream of frames based on the computed optical flow data; realigning the at least one differing frame based on the estimated optical flow data; and computing object depth data for the object based on realigned frames in the frame sequence; wherein the transverse velocity estimate data for the object is further generated based on the object depth data for the object. [0064-0065, 0081, 0076-0078]

4. The computing system of claim 3, wherein the set of frames in the frame sequence are captured by the time-of-flight sensor system over a period of time between 1 millisecond and 100 milliseconds. [0071]

5. The computing system of claim 1, wherein the sensor parameters of the time-of-flight sensor system when the frame is captured comprise at least one of: an illumination state of the time-of-flight sensor system, such that the time-of-flight sensor system either emits or is inhibited from emitting light for the frame; a relative phase delay between a transmitter system and a receiver system of the time-of-flight sensor system for the frame (fig. 5, different phases 0-3); or an integration time of the sensor of the time-of-flight sensor system for the frame.

6. The computing system of claim 1, wherein the pair of non-adjacent frames in the stream comprises successive frames of the same frame type. (fig. 5, phase images 0-5)

7. The computing system of claim 1, wherein the computed optical flow data is calculated for each pair of non-adjacent frames of the same frame type in successive frame sequences in the stream of frames. [0060]

10. The computing system of claim 1, wherein the time-of-flight sensor system comprises the computing system. (fig. 1)

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 9 and 11 are rejected under 35 U.S.C. 103 as being unpatentable over D1. Regarding claims 9 and 11, D1 does not explicitly teach:

9. The computing system of claim 1, wherein the pair of non-adjacent frames in the stream for which the computed optical flow data is calculated comprises successive frames having relative phase delays that are 180 degrees out of phase. However, the algorithm is relevant for any phase delay, and it would therefore have been obvious to one of ordinary skill in the art at the time of filing to modify the teachings of D1 in order to obtain coarse velocity estimates when only a small number of frames is available.

11. The computing system of claim 1, wherein an autonomous vehicle comprises the time-of-flight sensor system and the computing system. However, D1 teaches robotic control [0002], and it would have been obvious to one of ordinary skill in the art at the time of filing to modify the teachings of D1 for use in an autonomous vehicle in order to provide a collision avoidance system.

Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over D1 in view of D2, US 2020182984. D1 does not explicitly teach, but D2 teaches:

8. The computing system of claim 1, wherein the pair of non-adjacent frames in the stream for which the computed optical flow data is calculated comprises successive passive frames for which the time-of-flight sensor system is inhibited from emitting light. [0088] It would have been obvious to one of ordinary skill in the art at the time of filing to modify the teachings of D1 with the teachings of D2 in order to identify the phase value offset.

Claims 13-19 are rejected similarly to claims 2-11.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to HOVHANNES BAGHDASARYAN, whose telephone number is (571) 272-7845. The examiner can normally be reached Mon-Fri, 7am-5pm. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Yuqing Xiao, can be reached at (571) 270-3603. The fax number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/HOVHANNES BAGHDASARYAN/
Examiner, Art Unit 3645
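For readers less familiar with the disputed technique, the pipeline recited in claim 1 (optical flow computed between non-adjacent frames, flow interpolated to intermediate frame times, then depth-scaled into metric transverse velocity) can be sketched as follows. This is a minimal illustrative sketch under a pinhole-camera and constant-motion assumption; the function names and numeric values are ours, not the application's or D1's:

```python
def interpolate_flow(flow_px, t0, t1, t_mid):
    """Estimate optical flow at an intermediate frame time t_mid by
    linearly scaling the flow computed between frames at t0 and t1
    (assumes roughly constant motion across the short frame sequence)."""
    return flow_px * (t_mid - t0) / (t1 - t0)

def transverse_velocity(flow_px, depth_m, focal_px, dt_s):
    """Pinhole back-projection of pixel flow into metric transverse
    velocity: v = flow * Z / (f * dt)."""
    return flow_px * depth_m / (focal_px * dt_s)

# Hypothetical example: 10 px of flow over 10 ms on an object 5 m away,
# with a 1000 px focal length -> about 5 m/s transverse velocity.
v_x = transverse_velocity(10.0, 5.0, 1000.0, 0.01)
print(round(v_x, 6))  # 5.0
```

The realignment step in the claim would warp each intermediate frame by its interpolated flow before depth computation; that step is omitted here for brevity.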

Prosecution Timeline

Oct 28, 2022: Application Filed
Nov 24, 2025: Non-Final Rejection (§102, §103)
Feb 13, 2026: Examiner Interview Summary
Feb 13, 2026: Applicant Interview (Telephonic)
Feb 24, 2026: Response Filed
Mar 13, 2026: Final Rejection (§102, §103) (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12591059: OPTICAL RANGING DEVICE AND OPTICAL RANGING METHOD
Granted Mar 31, 2026 (2y 5m to grant)
Patent 12591047: OPTICAL SYSTEM FOR LIGHT DETECTION AND RANGING
Granted Mar 31, 2026 (2y 5m to grant)
Patent 12585000: RECEIVING DEVICE FOR AN OPTICAL MEASUREMENT APPARATUS FOR CAPTURING OBJECTS, LIGHT SIGNAL REDIRECTION DEVICE, MEASUREMENT APPARATUS AND METHOD FOR OPERATING A RECEIVING DEVICE
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12569880: CMOS ULTRASONIC TRANSDUCERS AND RELATED APPARATUS AND METHODS
Granted Mar 10, 2026 (2y 5m to grant)
Patent 12560721: SPAD LIDAR SYSTEM WITH BINNED PIXELS
Granted Feb 24, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 78%
With Interview: 94% (+16.1%)
Median Time to Grant: 3y 1m
PTA Risk: Moderate
Based on 971 resolved cases by this examiner. Grant probability derived from career allow rate.
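The headline projections follow directly from the career figures above, assuming the interview lift is applied as a simple additive percentage-point adjustment to the base allow rate:

```python
# Career figures stated elsewhere on this page.
granted, resolved = 759, 971

base = round(100 * granted / resolved)   # career allow rate -> 78%
interview_lift = 16.1                    # percentage-point lift with interview
with_interview = round(base + interview_lift)

print(base, with_interview)  # 78 94
```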
