Prosecution Insights
Last updated: April 19, 2026
Application No. 17/505,476

ELECTRONIC SURVEYING INSTRUMENT

Final Rejection — §102, §103
Filed
Oct 19, 2021
Examiner
RICHTER, KARA MARIE
Art Unit
3645
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Leica Geosystems AG
OA Round
3 (Final)
67%
Grant Probability
Favorable
4-5
OA Rounds
4y 0m
To Grant
99%
With Interview

Examiner Intelligence

Grants 67% — above average
67%
Career Allow Rate
10 granted / 15 resolved
+14.7% vs TC avg
Strong +42% interview lift
+41.7%
Interview Lift
resolved cases with interview
Typical timeline
4y 0m
Avg Prosecution
45 currently pending
Career history
60
Total Applications
across all art units

Statute-Specific Performance

§101
2.0%
-38.0% vs TC avg
§103
47.5%
+7.5% vs TC avg
§102
31.4%
-8.6% vs TC avg
§112
16.4%
-23.6% vs TC avg
Black line = Tech Center average estimate • Based on career data from 15 resolved cases

Office Action

§102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 29 December 2025 has been entered.

Response to Amendment

No amendments to the claims, when compared to previously filed claims, have been submitted with applicant's response dated 29 December 2025. Claims 1-14 are pending.

Response to Arguments

Applicant's arguments filed 29 December 2025 have been fully considered but they are not persuasive.
Applicant traverses the rejection of claims 1-5 under 35 USC § 102(a)(1) and (a)(2), and asserts that the cited prior art (Finkelstein, US 2019/0250257 A1) does not disclose the claimed limitation of “synchronizing the emitting of projection light and the acquiring of the sequence of images in such a way that for a first subsequence of the sequence of images a received power of the received projection light is lower during image acquisition phases than during remaining times which are not image acquisition phases,” specifically that the disclosure is silent on the synchronizing of light emission and image detection, and that the mere inclusion of a driver circuit configured to control the timing of optical emission, which may likewise control the timing of the detector array, does not teach synchronizing. Examiner again respectfully notes that the application’s claims are further silent on the exact definition of “synchronizing” the emission and detection of images, and therefore the broadest reasonable interpretation (BRI) was understood from the specification and drawings. Referencing Figs. 5, 7, 8 and 9, and paragraphs [0066] – [0070] of the instant application, “synchronization” was interpreted to define the relationship of acquisition and non-acquisition phases and the periods of emission of light. Examiner reasserts that the previously referenced paragraphs [0138] – [0139] and Figs. 7 and 8A of Finkelstein anticipate this BRI, as Finkelstein teaches emission windows and a plurality of strobe windows indicating activation and deactivation of SPADs/detectors. Paragraph [0130] of Finkelstein specifically points out a global timing circuit, which controls emission and detection timing, where “The global timing circuit changes the state of the second transistor upon activation of the SPAD array, so that the second transistor is configured to conduct current. 
In some embodiments, the second transistor is turned off by the global timing circuit synchronously with the pulsed emitter.” (emphasis added). Additionally, the instant application states that “Synchronization may proceed via a shared clock, for example, or by other digital means, for example by a controller controlling the projection light source and/or the image sensor.” ([0020], [0040]) Therefore, under the BRI of synchronization and how it is implemented, the previously referenced paragraphs of Finkelstein which describe the driver circuit and global timing circuit controlling the emission and/or detection timing would read on this limitation. Applicant again notes (pgs. 7-8 of Remarks) that the prior art does not relate to a sequence of images, with both image acquisition and non-image acquisition phases occurring at a particular frame rate, and declares that a wrong conclusion has been drawn from the prior art and that it “does not disclose synchronization”. Again, examiner reasserts that the previously referenced paragraphs [0138] – [0140] and Figs. 7 and 8A of Finkelstein teach emission windows and a plurality of strobe (detection) windows indicating activation and deactivation of SPADs/detectors, at particular frame rates as determined by a repetition of laser cycles interspersed with SPAD collection windows (pulse cycle in Fig. 7), where a plethora of cycles may make up a frame within the point cloud. This is not hindsight; the cited references, and Finkelstein taken as a whole, show a system which utilizes time bins within detection windows, where the detection windows are interspersed with emission windows, and groupings of repeated emission/detection cycles make up what Finkelstein refers to as “sub-frames”. 
The fact that Finkelstein includes a more intricate detection window between emissions of a laser does not preclude it from anticipating alternating emission and detection windows, as this argument is functionally more restrictive than the limitations as claimed. Applicant further notes (pgs. 9-10 of Remarks) that references made to paragraph [0209] of the prior art are not related to synchronization, and that the system of Finkelstein only utilizes active and passive SPADs in parallel. Applicant also notes that operating in parallel is “more efficient and/or less time consuming than a serial or alternating/sequential application/operation”. Regarding the comments pertaining to paragraph [0209], examiner again agrees that the references to paragraph [0209] are inherently not pertaining to synchronization, and further references [0130], which discusses synchronization between emission and detection circuits, and paragraphs [0180] – [0195] and [0209] – [0216] as further discussions of acquiring background frames and passive signals, respectively. Finkelstein notes this possibility, with “As an example and without loss of generality, once per a predetermined or desired number of active frames, a passive frame may replace an active frame. During the passive frame, the laser driver transistor/circuitry may not drive the laser to emit pulses.” It would be understood by one of ordinary skill in the art at the time of filing that a strobe cycle could be operated where a first subsequence of received projection light would be lower than other non-acquisition times, as it only includes passive acquisition SPADs. As is known in the art of LiDAR and image collection and analysis, by only using the series of SPADs which are optically isolated for a first acquisition phase, the system can collect only non-optical background values, which yields noise values such as dark currents. 
These values can then be used to compensate later non-acquisition frames which would have passive SPADs collecting background noise data, for example. Finally, the applicant argues that operating in parallel is “more efficient and/or less time consuming than a serial or alternating/sequential application/operation”, and therefore it is argued that Finkelstein teaches away from the current application. As noted above, Finkelstein does not teach away from the possibility of a sequential operation. Additionally, the examiner respectfully notes that if operating in parallel is in fact significantly more efficient, then the instant application would not constitute an improvement on the prior art.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claim(s) 1-5 and 10-13 is/are rejected under 35 U.S.C. 102(a)(1) and (a)(2) as being anticipated by Finkelstein (US 2019/0250257 A1).

Regarding claim 1, Finkelstein anticipates a method for an electronic surveying instrument, in particular a total station, laser tracker, electronic distance meter, stake-out instrument or laser scanner, comprising: emitting of projection light, in particular laser light from a laser ([0015], [0101]; Fig. 
1, emitter array (115)), in a targeted manner towards an object for indicating a targeting state to a user ([0170]), receiving ([0101]; Fig. 1, detector array (110)) at least part of the projection light reflected by the object ([0098]), acquiring a sequence of images of the object using an image sensor with a frame rate, in particular 20fps or higher ([0107]), and an induced succession of image acquisition phases and image acquisition pauses ([0138] - [0139]; Figs. 7 and 8A, where strobe, or collection, windows occur between laser pulses and are not overlapping with laser pulses), the image sensor being sensitive for the projection light ([0178]), and synchronizing the emitting of projection light and the acquiring of the sequence of images ([0101] - [0102], [0130]; Fig. 1, where emission and detection are controlled by a timing circuit (106)) in such a way that for a first subsequence of the sequence of images a received power of the received projection light is lower during image acquisition phases than during remaining times which are not image acquisition phases ([0142], [0209] – [0216], where dark current or background photon counts can be selectively recorded without signal photons, e.g. by suspending firing for a specific frame or subframe or utilization of only passive acquisition SPADs for a first frame within a sub-sequence of frames, where a passive acquisition SPAD may further be isolated from any signal photons, and passive or partially-passive frames are used for background correction).

Regarding claim 2, Finkelstein anticipates the method according to claim 1, wherein the projection light has a wavelength in the visible spectrum ([0117], [0179]; Fig. 4A, where the transmission spectrum is for a filter which allows LIDAR reflection light to reach detectors and is centered on 520 nm, and Fig. 18B, which shows an integrated visible TOF-OR device as one embodiment). 
Regarding claim 3, Finkelstein anticipates the method according to claim 1, wherein the synchronizing is done in such a way that for at least the first subsequence the received power is zero during image acquisition phases ([0209] - [0216]).

Regarding claim 4, Finkelstein anticipates the method according to claim 1, wherein a union of a nonempty second subsequence of the sequence of images and the first subsequence forms the sequence of images and difference images using images of the first and second subsequence are provided ([0214], where n_sig = n_(bg+sig) - n_bg is the calculated difference between the counter output during each active sub-frame (n_(bg+sig)) and the value of the last passive sub-frame counter for the same pixel (n_bg)).

Regarding claim 5, Finkelstein anticipates the method according to claim 1, wherein the synchronizing is done in such a way that the received power in the first subsequence is adapted to a saturation level of the image sensor, in particular adapted in such a way that oversaturation is prevented ([0154], [0198] - [0199]; Fig. 14, saturation control circuits (1455) can adjust collections based on the signal and background photon count levels measured (i) in a previous cycle, (ii) in another pixel, or (iii) in both a previous cycle and another pixel).

Regarding claim 10, Finkelstein anticipates an electronic surveying instrument, in particular a total station, laser tracker, electronic distance meter, stake-out instrument or laser scanner, comprising: a projection light emitting unit, in particular a laser ([0015], [0101]; Fig. 1, emitter array (115)), an optical emitting light path from the projection light emitting unit to an object ([0108]; Fig. 2, light is emitted by VCSEL array (215) and may pass through beam shaping emitter optics (214) before illuminating the target), an optical receiving light path from the object to an image sensor ([0115]; Fig. 
3) with a frame rate, in particular 20fps or higher ([0107]), wherein the image sensor is configured in such a way that after each image acquisition phase an image acquisition pause follows ([0138] - [0139]; Figs. 7 and 8A, where strobe, or collection, windows occur between laser pulses and are not overlapping with laser pulses), the image sensor being sensitive for the projection light ([0178]), wherein the projection light emitting unit and the image sensor are synchronized in such a way as to provide the method according to claim 1.

Regarding claim 11, Finkelstein anticipates the electronic surveying instrument according to claim 10, further comprising a controller unit configured to provide the synchronization between the projection light emitting unit and the image sensor ([0101] - [0102]; Fig. 1, controlled by a timing circuit (106)).

Regarding claim 12, Finkelstein anticipates the electronic surveying instrument according to claim 10, wherein the optical receiving light path is free of dedicated optical filters for the projection light ([0022], where some embodiments may have a tunable optical filter arranged to transmit (not block) a band based on the emission wavelength of the light source).

Regarding claim 13, Finkelstein anticipates the electronic surveying instrument according to claim 11, wherein the projection light emitting unit comprises a chopper wheel or a liquid crystal shutter, wherein the chopper wheel or the liquid crystal shutter are controlled by the controller unit, and/or the controller unit is configured to turn the projection light emitting unit on and off ([0007], where the light source is periodically pulsed for illumination).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 6-7 is/are rejected under 35 U.S.C. 103 as being unpatentable over Finkelstein (US 2019/0250257 A1), as applied to Claims 1 and 4 above, and further in view of Hicks (US 2020/0018854 A1).

Regarding claim 6, Finkelstein does not explicitly teach, but Hicks does teach, in-situ identifying and discarding images of the second subsequence from the sequence of images, wherein the identifying is done using at least the images ([0013] - [0014]) and information about the image sensor ([0059] - [0061]; Fig. 4, receiver (140) has a specific field of view which may affect the image collected). Therefore, to one of ordinary skill in the art before the effective filing date of the claimed invention, it would have been prima facie obvious to modify Finkelstein to incorporate the teachings of Hicks to also identify and subsequently discard secondary images based on information within the images, with a reasonable expectation of success. Finkelstein teaches controlling timing circuitry to be adjusted in the event of object identification in adjacent sub-frames, as well as determining false positives within data collected in frames ([0225] – [0226]). In-situ identification of images in the system of Finkelstein would have a predictable result of allowing for deletion of false-positive images, thereby reducing memory requirements of the system. 
Regarding claim 7, Finkelstein does not explicitly teach, but Hicks does teach, identifying images based on the synchronization ([0086], [0101], [0115]-[0118], where the Lidar collects data in time-gated intervals and images can be discarded based on the image and relevant time intervals). To one of ordinary skill in the art before the effective filing date of the claimed invention, it would have been prima facie obvious to modify Finkelstein to incorporate the teachings of Hicks to identify secondary images based on information within the images, as well as based on synchronization information within a timing scheme, with a reasonable expectation of success. As discussed above, Finkelstein teaches controlling timing circuitry to be adjusted in the event of object identification in adjacent sub-frames, as well as determining false positives within data collected in frames ([0225] – [0226]), and therefore identification of images based on synchronization information would have a predictable result of attaching additional information to frames for later processing, such as location indications, as noted by Hicks ([0118]).

Claim(s) 8 is/are rejected under 35 U.S.C. 103 as being unpatentable over Finkelstein (US 2019/0250257 A1), as applied to Claim 1 above, and further in view of Dumoulin (US 11,789,127 B2). Finkelstein also teaches a rolling shutter readout scheme ([0165]) used to read out pixel values. Finkelstein fails to teach reading out specific pixels or portions of the sensor based on where incident light falls. Dumoulin teaches the sequence of images is acquired in a rolling shutter manner, and the synchronizing is at least done for rolling shutter scan lines intersecting parts of the image sensor on which the received projection light falls (Col. 9, line 62 - Col. 10, line 20). 
Therefore, to one of ordinary skill in the art before the effective filing date of the claimed invention, it would have been prima facie obvious to modify Finkelstein to incorporate the teachings of Dumoulin to only read out specific pixels within a range, with a reasonable expectation of success. Finkelstein discusses use of a quasi-linear SPAD array, where specific pixels are read out if they detect photons within a specific timeframe ([0121]). Implementing the readout of pixels based on whether the SPADs within that section of the detector array detect a photon would have the predictable result of reducing SPAD recovery time, allowing for faster pulse emission or laser modulation rates.

Claim(s) 9 and 14 is/are rejected under 35 U.S.C. 103 as being unpatentable over Finkelstein (US 2019/0250257 A1), as applied to Claims 1 and 10 above, and further in view of Grässer (US 10,168,153 B2).

Regarding claims 9 and 14, Finkelstein does not explicitly teach, but Grässer teaches, a display (Col. 37, lines 13-27; Fig. 13, screen display (1300)), configured to … wherein the display is in particular linked via a wireless or wired connection (Col. 52, lines 7-16) to the image sensor and/or to the controller unit (Figs. 3A, 3B, image capture device (305)) … displaying the first sequence to a user of the electronic surveying instrument (Col. 25, lines 42-58; Fig. 8, step (825), display captured image). Therefore, to one of ordinary skill in the art before the effective filing date of the claimed invention, it would have been prima facie obvious to modify Finkelstein to incorporate the teachings of Grässer to display specific frames/images with a reasonable expectation of success. 
Finkelstein discusses embodiments which have processing units which may store and/or display results of a 3D point cloud ([0170]), and integration of the display of captured images of Grässer would predictably allow a first sequence of images to be visible to a user for observation and/or analysis, with information displayed on a physical display which is connected (either directly or wirelessly) to the detection unit.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Carlén et al. (US 2019/0331486 A1) teaches geodetic surveying with time synchronization, where targets are identified by obtaining a difference image between a first image and a second image, and a direction to the target is determined. Mahara (US 2024/0019579 A1) teaches a time-of-flight sensor system, where synchronization between signals, emission and receiving/exposure times is used in processing and target detection by pulsed emission. Kaler et al. (US 2024/0207475 A1) teaches an application of light emission (UV) and sensors to detect objects and events in a space (for purposes of disinfection), where methods of controlling a light beam may include, for example, direction, speed, and timing of illumination and differential frame-by-frame image analysis.

All claims are identical to or patentably indistinct from, or have unity of invention with, claims in the application prior to the entry of the submission under 37 CFR 1.114 (that is, restriction (including a lack of unity of invention) would not be proper), and all claims could have been finally rejected on the grounds and art of record in the next Office action if they had been entered in the application prior to entry under 37 CFR 1.114. Accordingly, THIS ACTION IS MADE FINAL even though it is a first action after the filing of a request for continued examination and the submission under 37 CFR 1.114. See MPEP § 706.07(b). 
Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). As noted in MPEP § 706.07(b), applications in which a request for continued examination (RCE) has been filed, where the presented claims are identical to those presented prior to the entry of the submission of the RCE, may be finally rejected in the first action following the filing of the RCE. In the case of the instant application, the claims dated 29 December 2025 (those filed with the RCE) are identical to those filed on 19 October 2021 and 28 August 2025. A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Kara Richter whose telephone number is (571)272-2763. The examiner can normally be reached Monday - Thursday, 8A-5P EST, Fridays are variable. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Helal Algahaim can be reached at (571) 270-5227. 
The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /K.M.R./Examiner, Art Unit 3645 /JAMES R HULKA/Primary Examiner, Art Unit 3645
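The background-correction arithmetic cited in the claim 4 rejection (Finkelstein [0214]) subtracts the last passive sub-frame counter (n_bg) from each active sub-frame counter (n_(bg+sig)) per pixel to obtain n_sig. A minimal sketch of that computation follows; the function name, array layout, and zero-clipping choice are illustrative assumptions, not details from the patent or this office action.

```python
import numpy as np

def difference_images(active_frames, passive_frame):
    """Per-pixel background subtraction: n_sig = n_(bg+sig) - n_bg.

    active_frames: iterable of 2-D photon-count arrays (active sub-frames)
    passive_frame: 2-D photon-count array from the last passive sub-frame
    Results are clipped at zero (an assumption: a signal count is non-negative).
    """
    n_bg = np.asarray(passive_frame, dtype=np.int64)
    return [np.clip(np.asarray(f, dtype=np.int64) - n_bg, 0, None)
            for f in active_frames]

# Hypothetical 2x2 sub-frames: two active, one passive.
active = [np.array([[5, 7], [9, 3]]), np.array([[6, 2], [8, 4]])]
passive = np.array([[1, 2], [3, 4]])
n_sig = difference_images(active, passive)
```

With the sample values above, the first difference image is [[4, 5], [6, 0]]: the pixel with 3 active counts falls below its 4-count background and is clipped to zero rather than reported as a negative signal.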

Prosecution Timeline

Oct 19, 2021
Application Filed
May 22, 2025
Non-Final Rejection — §102, §103
Aug 28, 2025
Response Filed
Oct 01, 2025
Final Rejection — §102, §103
Dec 29, 2025
Request for Continued Examination
Feb 04, 2026
Response after Non-Final Action
Mar 04, 2026
Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601841
FMCW HETERODYNE-DETECTION LIDAR IMAGER SYSTEM WITH IMPROVED DISTANCE RESOLUTION
2y 5m to grant • Granted Apr 14, 2026
Patent 12571892
DISTANCE MEASUREMENT DEVICE AND DISTANCE MEASUREMENT METHOD
2y 5m to grant • Granted Mar 10, 2026
Patent 12554018
Method and Apparatus for Determining Distance Information
2y 5m to grant • Granted Feb 17, 2026
Patent 12553995
DATA REFINEMENT IN OPTICAL SYSTEMS
2y 5m to grant • Granted Feb 17, 2026
Patent 12553991
LIDAR DEVICE
2y 5m to grant • Granted Feb 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

4-5
Expected OA Rounds
67%
Grant Probability
99%
With Interview (+41.7%)
4y 0m
Median Time to Grant
High
PTA Risk
Based on 15 resolved cases by this examiner. Grant probability derived from career allow rate.
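As a rough sketch of how headline figures like these could be derived from per-case outcome data — allow rate as grants over resolved cases, and interview lift as the allow-rate gap between interviewed and non-interviewed cases — consider the following. The record format, function names, and sample split are hypothetical; the actual methodology behind the dashboard's numbers is not disclosed here.

```python
def allow_rate(cases):
    """Fraction of resolved cases that were granted.
    Each case is a (granted, had_interview) pair of booleans."""
    return sum(granted for granted, _ in cases) / len(cases) if cases else 0.0

def interview_lift(cases):
    """Allow rate with interviews minus allow rate without, in percentage points."""
    with_iv = [c for c in cases if c[1]]
    without_iv = [c for c in cases if not c[1]]
    return 100.0 * (allow_rate(with_iv) - allow_rate(without_iv))

# Hypothetical docket of 15 resolved cases, 10 granted overall:
# 6 of 7 interviewed cases granted, 4 of 8 non-interviewed cases granted.
cases = ([(True, True)] * 6 + [(False, True)] * 1 +
         [(True, False)] * 4 + [(False, False)] * 4)
```

With this hypothetical split, the career allow rate comes out to 10/15 (about 67%) and the lift to roughly +36 percentage points; the dashboard's +41.7% would correspond to a different, undisclosed split.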
