Prosecution Insights
Last updated: April 19, 2026
Application No. 17/390,402

SURVEY DEVICE, SYSTEM AND METHOD

Final Rejection — §102, §103
Filed: Jul 30, 2021
Examiner: CHILTON, CLARA GRACE
Art Unit: 3645
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Clearedge3D Inc.
OA Round: 3 (Final)
Grant Probability: 56% (Moderate)
OA Rounds: 4-5
To Grant: 3y 12m
With Interview: 67%

Examiner Intelligence

Career Allow Rate: 56% (grants 31 of 55 resolved cases; +4.4% vs TC avg)
Interview Lift: +10.6% among resolved cases with interview (moderate, ~+11% lift)
Avg Prosecution: 3y 12m (typical timeline)
Currently Pending: 43
Total Applications: 98 (career history, across all art units)

Statute-Specific Performance

§101: 1.4% (-38.6% vs TC avg)
§103: 58.1% (+18.1% vs TC avg)
§102: 23.4% (-16.6% vs TC avg)
§112: 15.6% (-24.4% vs TC avg)
TC averages are estimates • Based on career data from 55 resolved cases
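The headline probabilities on this page are simple ratios over the examiner's resolved cases. As a sketch of the arithmetic only (function and variable names are ours, not the tool's), the 56% career rate and the 67% with-interview figure follow from 31/55 plus the stated +10.6 percentage-point lift:

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a fraction of resolved cases."""
    return granted / resolved

career = allow_rate(31, 55)        # ~0.564, displayed as 56%
with_interview = career + 0.106    # +10.6 pp interview lift
print(round(career * 100), round(with_interview * 100))  # 56 67
```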

Office Action

§102 §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments filed 02/13/2026 have been fully considered but they are not persuasive.

Applicant argues Best does not teach obtaining a virtual model or a scene model; instead, Best only teaches using an IMU to determine the geodetic position of an initial position in [0027]. Examiner respectfully disagrees. First, examiner notes the cited paragraph 61 does not recite what applicant quotes in the arguments (pg 9, definition of scene model); instead, this is described in [0016]. In the arguments and the specification, Applicant defines a scene model as a virtual model which describes the geometry (shape and physical dimensions) of a scene. Best teaches, at [0030]-[0031] and [0040], stereo imaging techniques. These produce a 3D image of the scene, which would include 3D geometry of the surroundings (such as measurements of scale and depth). Also, as the two images from each camera must be combined, the result would be a virtual image, as it is a combination of multiple acquired images.

Further, examiner notes the above definition of a scene model appears only in the specification, not the claims. All the claims recite is that a scene model is obtained which corresponds to an initial set of measurement data. As discussed in MPEP 2111.01.II, BRI and plain meaning do not require further limitations from the specification to be 'imported' into the claims. The limitation of a scene model being a virtual model which describes the geometry of a scene is only in the specification. Thus, the only limitation in the independent claims is "a scene model," which would be interpreted, under BRI, as any 3D image of the scene (although examiner notes any 3D image of the scene would automatically capture the geometry, due to the definition of a 3D image). Thus, this argument is not persuasive.
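The examiner's stereo-imaging point relies on the standard disparity-to-depth relation for a calibrated camera pair, Z = f * B / d. A minimal sketch with illustrative numbers (none taken from Best or the application):

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo relation: depth Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

# A point with 20 px disparity seen by cameras 0.5 m apart (f = 1000 px):
print(stereo_depth(1000.0, 0.5, 20.0))  # 25.0 (metres)
```

Repeating this per pixel over a disparity map is what yields the dense 3D scene geometry the rejection relies on.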
Applicant argues Best does not teach determining the location of the survey device relative to the scene model based on measurement data, as Best fails to teach a scene model. Examiner respectfully disagrees; as this argument is premised on the scene model, see the response above.

Applicant argues Best fails to teach updating the location of the survey device relative to the scene model, as Best fails to teach a scene model. Examiner respectfully disagrees; as this argument is premised on the scene model, see the response above.

Applicant argues Best fails to teach a processor configured to provide information, based on […] data, that is useable to move the indicator towards a location; instead, applicant argues Best only teaches the calculation of a position change after movement in [0029]. Examiner respectfully disagrees. A recitation of the intended use of the claimed invention must result in a structural difference between the claimed invention and the prior art in order to patentably distinguish the claimed invention from the prior art. If the prior art structure is capable of performing the intended use, then it meets the claim. "Useable to" indicates intended use, and a system which maps out an area could be used to determine a position to move an indicator to.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1, 2, 4-7, 9, 11-15, 17, and 18 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Best (US 20210190969 A1).
Claim 1: Best teaches a system, comprising: a survey device comprising a support and a sensor attached to the support (Fig. 2, system 200, cameras 212, pole 208), the sensor configured to capture measurement data ([0029]); and at least one processor coupled to the sensor to receive the measurement data (Fig. 2, processors 216), wherein the at least one processor is configured to: obtain a scene model corresponding to an initial set of the measurement data captured by the sensor when the support is located at an initial position (Fig. 1, initial position 112 and [0027]), determine a location of the survey device relative to the scene model, based on the initial set of the measurement data and the scene model ([0025]), and update the location of the survey device relative to the scene model, based on subsequent sets of the measurement data captured by the sensor when the support is located at corresponding subsequent positions (Fig. 1, measurement position 104 and [0029]), wherein the support comprises an indicator (Fig. 2, first end 218 and [0041] - using end to measure tilt), the sensor and the indicator have a known or determinable spatial relationship ([0024]), and the at least one processor is configured to provide information, based on the measurement data and the spatial relationship between the sensor and the indicator, that is usable to move the indicator towards a location ([0029]).

Claim 2: Best teaches the system of Claim 1, wherein the at least one processor is configured to receive the scene model, compute a transform to match at least one of (i) the initial set or (ii) at least one of the subsequent sets of the measurement data to the scene model, and apply the transform to compute the location of the survey device corresponding to the at least one of (i) the initial set or (ii) the at least one of the subsequent sets relative to the scene model ([0031] – using dead reckoning to compute change in position).
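Claim 2's "compute a transform ... and apply the transform" limitation is, in general terms, a registration step: solve for a transform that maps measurement data onto the scene model, then apply it. As an illustration only (not the applicant's or Best's actual method), applying a 2D rigid transform (rotation plus translation) to measurement points looks like:

```python
import math

def apply_rigid_2d(points, theta, tx, ty):
    """Rotate each (x, y) point by theta radians, then translate by (tx, ty)."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in points]

# A quarter-turn plus a unit shift in x maps (1, 0) to approximately (1, 1):
aligned = apply_rigid_2d([(1.0, 0.0)], math.pi / 2, 1.0, 0.0)
print(aligned)
```

The same transform, inverted, gives the device's pose relative to the model, which is why registration and localization are two views of one computation.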
Claim 4: Best teaches the system of claim 1, wherein the at least one processor is configured to generate the scene model based on the initial set of the measurement data captured by the sensor when the support is located at the initial position ([0026]).

Claim 5: Best teaches the system of claim 1, wherein at least one of (i) the initial position or (ii) at least one of the subsequent positions has a known location ([0023] - first location can receive GNSS signals).

Claim 6: Best teaches the system of claim 1, wherein the at least one processor is configured to update the scene model based on the measurement data ([0026] - initial position calculation).

Claim 7: Best teaches the system of claim 1, wherein the indicator is a tip on the support (Fig. 2, end 218 on tip of rod 208).

Claim 9: Best teaches the system of claim 1, wherein the support is a rod (Fig. 2, pole 208), the indicator is on an end portion of the rod (Fig. 2, end 218), and the survey device comprises at least one of: the at least one processor supported by the rod, a display supported by the rod, and configured to display the location of the survey device relative to the scene model, a further sensor supported by the rod, and configured to track the location of the survey device by using dead reckoning, or a prism supported by the rod, and configured to communicate with for location determination of the prism by a total station, and the survey device being configured to obtain the total station determined prism location (Fig. 2, processor 216 and IMU 224 and [0031]).

Claim 11: As Claim 11 is a method claim corresponding to Claim 1, see rejection above.

Claim 12: Best teaches the method of claim 11, wherein in said capturing the measurement data, the measurement data are captured when the support and the sensor are moved while keeping the indicator stationary ([0027] - measuring tilt implies pole tilts while end remains stationary).
Claim 13: Best teaches the method of claim 11, wherein the initial position is previously marked in the scene, or the location of the initial position is determined by further survey equipment interacting with the survey device having the indicator placed at the initial position ([0026] - using GNSS data).

Claim 14: Best teaches the method of claim 11, wherein said obtaining the scene model comprises receiving the scene model, and mapping the measurement data captured when the indicator is at the initial position to the received scene model, and said localizing comprises determining a location of the survey device relative to the scene model based on the mapping ([0026]).

Claim 15: Best teaches the method of claim 11, wherein said obtaining the scene model comprises generating the scene model based on an initial set of the measurement data captured by the sensor when the indicator is located at the initial position ([0026] - initial measurement).

Claim 17: Best teaches the method of claim 11, wherein in said capturing the measurement data, the measurement data is generated using at least one of laser scanning, image capturing, or echolocation ([0024] and Fig. 2, camera 212).

Claim 18: Best teaches the method of claim 11, further comprising: periodically capturing, at a first interval and by a further sensor attached to the support, information indicating a location of the survey device as the survey device is moving around the scene; periodically capturing, at a second interval greater than the first interval and by the sensor, the measurement data; and updating the location of the survey device by matching the measurement data periodically captured by the sensor with the scene model, wherein said matching uses the information periodically captured by the further sensor as an estimate of the location of the survey device ([0029] - using IMU and camera).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 3 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Best (US 20210190969 A1) in view of Wang (US 20120163656 A1).

Claim 3: Best teaches the system of Claim 2. Best does not teach, but Wang does teach, wherein the transform comprises at least one of a non-linear transform, translation, rotation, scaling, or shearing ([0175]). It would have been obvious before the effective filing date to use the transform, as taught by Wang, in the system as taught by Best, because different types of transforms are mathematical methods and equations, and thus would be known in the art and yield predictable results (as the results of such equations are well known).

Claim 10: Best teaches the system of claim 1, wherein the support is a rod (Fig. 2, pole 208), the indicator is on an end portion of the rod (Fig. 2, end 218), and the survey device further comprises […] the measurement data being only captured directly from the sensor ([0029]), an Inertial Measurement Unit (IMU) supported by the rod (Fig. 2, IMU 224), coupled to the at least one processor, and configured to track the location of the survey device by using dead reckoning ([0031]). Best does not teach, but Wang does teach, wherein the sensor comprises a Light Detection and Ranging (LIDAR) scanner ([0068]).
It would have been obvious before the effective filing date to use LiDAR, as taught by Wang, in the system as taught by Best, because LiDAR is a well known technique in the surveying arts.

Claims 8 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Best (US 20210190969 A1) in view of Ichiriyama (US 20230003527 A1).

Claim 8: Best teaches the system of claim 1, wherein the at least one processor is configured to determine a current position of the indicator based on the location of the survey device and the known or determinable spatial relationship between the sensor and the indicator ([0041] - determining angle). Best does not teach, but Ichiriyama does teach, obtain[ing] a location of a layout point where a construction work is to be performed, and determine and output a spatial relationship between the current position of the indicator and the layout point, the spatial relationship usable to direct a worker to the layout point to mark the layout point for the construction work (Fig. 5 – note "construction work" is a broad category and would be interpreted to include performing measurements). It would have been obvious to use the method of directing workers to perform measurement at different points as taught by Ichiriyama with the system as taught by Best because, as Ichiriyama teaches, this allows for efficient cross-sectional surveying (see Ichiriyama [0025]).

Claim 16: As Claim 16 is a method claim corresponding to Claim 8, see rejection above.

Claims 19 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Best (US 20210190969 A1) in view of Wang (US 20120163656 A1), further in view of Ichiriyama (US 20230003527 A1).

Claim 19: Best teaches a survey device, comprising: a rod having a physical indicator (Fig.
2, pole 208, end 218 and [0041] - determining angle with endpoint); […] and having a predetermined spatial relationship with the physical indicator ([0024]); and at least one of a processor supported by the rod and coupled to the LIDAR scanner, or a data interface supported by the rod (Fig. 2, processor 216), wherein at least one of the processor or the external processor is configured to localize the survey device relative to a scene model corresponding to measurement data captured by the […] scanner ([0026]).

Best does not teach, but Wang does teach, a Light Detection and Ranging (LIDAR) scanner rigidly attached to the rod ([0068] along with Fig. 6 – camera and such arranged on rod), and configured to couple the LIDAR scanner to an external processor (Fig. 6, data collector 162), and to generate information, based on the measurement data and the spatial relationship between the LIDAR scanner and the physical indicator, that is usable to move the physical indicator towards a location. Best, as modified in view of Wang, does not teach, but Ichiriyama does teach, generat[ing] information, based on the measurement data and the spatial relationship between the LIDAR scanner and the physical indicator, that is usable to move the physical indicator towards a location (Fig. 5). It would have been obvious to use the method of directing workers to perform measurement at different points as taught by Ichiriyama with the system as taught by Best because, as Ichiriyama teaches, this allows for efficient cross-sectional surveying (see Ichiriyama [0025]).
Claim 20: Best, as modified by Wang and Ichiriyama, teaches the survey device of claim 19, further comprising: an Inertial Measurement Unit (IMU) supported by the rod, and coupled to at least one of the processor or the data interface, wherein at least one of the processor or the external processor is configured to, as the survey device is moving, use information captured by the IMU as an estimate of a location of the survey device to perform matching the measurement data captured by the LIDAR scanner with the scene model, and update the location of the survey device based on said matching ([0027], [0029], and Fig. 2, IMU 224).

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CLARA CHILTON whose telephone number is (703)756-1080. The examiner can normally be reached Monday-Friday 6-2 MT. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Helal Algahaim, can be reached on 571-270-5227. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/CLARA G CHILTON/
Examiner, Art Unit 3645

/HELAL A ALGAHAIM/
SPE, Art Unit 3645
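Claims 18 and 20 of this application both recite a two-rate update loop: a further sensor (an IMU in the cited passages of Best) supplies frequent dead-reckoned estimates, and slower scanner-to-scene-model matching periodically corrects them. A minimal sketch of that pattern, with illustrative names and values (not taken from Best or the application):

```python
def fuse(imu_deltas, scan_fixes, scan_period):
    """Integrate fast IMU position deltas; overwrite the estimate with a
    scan-to-model fix every scan_period steps (the slower, more accurate sensor)."""
    fused, est = [], 0.0
    for step, delta in enumerate(imu_deltas):
        est += delta                               # first (short) interval: dead reckoning
        if step % scan_period == scan_period - 1:  # second (longer) interval: scan match
            est = scan_fixes[step // scan_period]
        fused.append(est)
    return fused

print(fuse([1.0, 1.0, 1.0, 1.0], [2.5, 4.5], 2))  # [1.0, 2.5, 3.5, 4.5]
```

The IMU estimate also serves as the initial guess that makes the scan-to-model matching tractable, which is the role the claims assign to the "further sensor" information.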

Prosecution Timeline

Jul 30, 2021
Application Filed
Aug 17, 2021
Response after Non-Final Action
Mar 12, 2025
Non-Final Rejection — §102, §103
Aug 31, 2025
Interview Requested
Sep 16, 2025
Examiner Interview Summary
Sep 25, 2025
Response Filed
Nov 10, 2025
Non-Final Rejection — §102, §103
Feb 13, 2026
Response Filed
Mar 20, 2026
Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by the same examiner with similar technology

Patent 12566251 • INTEGRATED AND COMPACT LIDAR MEASURMENT SYSTEM • Granted Mar 03, 2026 (2y 5m to grant)
Patent 12523748 • DETECTOR HAVING QUANTUM DOT PN JUNCTION PHOTODIODE • Granted Jan 13, 2026 (2y 5m to grant)
Patent 12481040 • LOW POWER LiDAR SYSTEM WITH SMART LASER INTERROGRATION • Granted Nov 25, 2025 (2y 5m to grant)
Patent 12474454 • SENSOR WITH CROSS TALK SUPPRESSION • Granted Nov 18, 2025 (2y 5m to grant)
Patent 12461208 • DIFFRACTIVE LIGHT DISTRIBUTION FOR PHOTOSENSOR ARRAY-BASED LIDAR RECEIVING SYSTEM • Granted Nov 04, 2025 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 4-5
Grant Probability: 56% (67% with interview, +10.6%)
Median Time to Grant: 3y 12m
PTA Risk: High
Based on 55 resolved cases by this examiner. Grant probability derived from career allow rate.
