Prosecution Insights
Last updated: April 19, 2026
Application No. 18/356,871

GLOBAL OPTIMIZATION METHODS FOR MOBILE COORDINATE SCANNERS

Final Rejection §103
Filed: Jul 21, 2023
Examiner: HUNTSINGER, PETER K
Art Unit: 2682
Tech Center: 2600 — Communications
Assignee: Faro Technologies Inc.
OA Round: 2 (Final)

Grant Probability: 28% (At Risk)
OA Rounds: 3-4
To Grant: 4y 11m
With Interview: 45%

Examiner Intelligence

Grants only 28% of cases.

Career Allow Rate: 28% (90 granted / 322 resolved; -34.0% vs TC avg)
Interview Lift: strong, +16.7% among resolved cases with an interview
Typical Timeline: 4y 11m average prosecution; 59 applications currently pending
Career History: 381 total applications across all art units

Statute-Specific Performance

§101: 9.3%  (-30.7% vs TC avg)
§103: 50.3% (+10.3% vs TC avg)
§102: 19.4% (-20.6% vs TC avg)
§112: 19.0% (-21.0% vs TC avg)

Tech Center averages are estimates • Based on career data from 322 resolved cases
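The four deltas above are mutually consistent with a single Tech Center baseline. As a quick check (assuming, as the chart note suggests, that each delta is simply the examiner's per-statute allow rate minus the estimated TC average, both in percent), the implied baseline can be recovered; the function name is hypothetical:

```python
def implied_tc_average(examiner_rate, delta_vs_tc):
    """Recover the implied Tech Center average allow rate (in percent),
    assuming delta_vs_tc = examiner_rate - tc_average."""
    return examiner_rate - delta_vs_tc

# Each (examiner rate, delta) pair from the table above.
pairs = [(9.3, -30.7), (50.3, 10.3), (19.4, -20.6), (19.0, -21.0)]
baselines = [implied_tc_average(rate, delta) for rate, delta in pairs]
```

All four pairs imply the same ~40% baseline, suggesting the chart uses one TC-wide estimate rather than per-statute averages.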

Office Action

§103
DETAILED ACTION

Claims 13 and 20 have been canceled. Claims 1-12 and 14-19 are currently pending. The objection to the title of the invention is withdrawn due to Applicant's amendment. The rejection of claim 8 under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, is withdrawn due to Applicant's amendment.

Response to Arguments

Applicant's arguments filed 1/22/26 have been fully considered but they are not persuasive. The Applicant argues on pages 8 and 9 of the response, in essence, that the above-recited passages do not teach or suggest "limiting the transformation to only a horizontal plane," as recited in amended independent claims 1 and 16. For example, a translation value with a vertical component and/or a rotation value with a non-zero pitch component is/are indicative of a transformation that is not limited to a horizontal plane, and Wohlfeld fails to teach or suggest that vertical components of the first and second translation values are zero and that pitch components of the first and second rotation values are zero.

Wohlfeld discloses that the controller 468 is configured to determine a first translation value, a second translation value, along with first and second rotation values (yaw, roll, pitch) that, when applied to a combination of the first 2D scan data and second 2D scan data, results in transformed first 2D data that closely matches transformed second 2D data according to an objective mathematical criterion (paragraph 122). Although Wohlfeld stores rotation values in addition to the 2D translation values, the transformation provided by the 2D scan data is still limited to a horizontal plane.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-12 and 14-19 are rejected under 35 U.S.C. 103 as being unpatentable over Yoon et al., US Publication 2025/0110503 (hereafter "Yoon"), Huber, US Publication 2021/0027477 ("Huber"), and Wohlfeld et al., US Publication 2021/0183081 (hereafter "Wohlfeld").

Referring to claims 1 and 16, Yoon discloses a mobile three-dimensional (3D) measuring system, comprising: a 3D measuring device configured to capture 3D data in a multi-level architecture (paragraph 262, 264, As illustrated in (a) of FIG. 21A, the robot R may travel within the building 1000 while performing sensing or scanning of the space inside the building 1000); a sensor configured to estimate an altitude of the 3D measuring device (paragraph 145, The monitoring screen 1400 is a screen that may monitor the plurality of robots R positioned within the building 1000, which includes a plurality of floors); and one or more processing units coupled with the 3D measuring device and the orientation sensor, the one or more processing units configured to perform a method comprising: receiving a first portion of the 3D data captured by the 3D measuring device (paragraph 262, 264, As illustrated in (a) of FIG.
21A, the robot R may travel within the building 1000 while performing sensing or scanning of the space inside the building 1000); determining a level index based on the altitude estimated by the sensor, the level index indicates a level of the multi-level architecture at which the first portion of the 3D data is captured (paragraph 82, As described above, in the building according to some example embodiments, it is possible to extract and monitor positions of the robots using various infrastructures provided in the building); associating the level index with the first portion (paragraph 156, The map list 1500 may include items 1510, 1520, and 1530, each corresponding to at least one of a plurality of floors); and generating a map of the multi-level architecture using the first portion, the generating comprises registering the first portion with a second portion of the 3D data responsive to the level index of the first portion being equal to the level index of the second portion (paragraph 339, As described above, in some example embodiments, not only may the specific map 1700 be generated using the sensing information sensed by the robot R while traveling within the building 1000 and sensing the space 10 and the spatial meta information on the space 10, but a user interface may also be provided that allows the user to generate the specific map 1700).

While Yoon discloses determining an altitude of the 3D measuring device, Yoon does not disclose expressly a sensor configured to estimate an altitude of the 3D measuring device. Huber discloses an orientation sensor configured to estimate an altitude of the 3D measuring device (paragraph 47, In some embodiments, one may utilize barometric pressure sensors/altimeter to establish scanning height and subsequent floor height).

At the time of the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to incorporate an altimeter into the robot's sensors. The motivation for doing so would have been to allow the robot to determine scanning height in order to better determine its location.

Yoon discloses registering the first portion with the second portion, but does not disclose expressly compensating for a drifting error. Wohlfeld discloses wherein registering the first portion with the second portion comprises: determining a transformation to be applied to the first portion to compensate for a drifting error (paragraph 160, the map is used as a reference, so that an iterative closest point (ICP) algorithm can detect and correct the drifting parts of the point cloud); limiting the transformation to only a horizontal plane (paragraph 122, the controller 468 is configured to determine a first translation value, a second translation value, along with first and second rotation values (yaw, roll, pitch) that, when applied to a combination of the first 2D scan data and second 2D scan data, results in transformed first 2D data that closely matches transformed second 2D data according to an objective mathematical criterion); and transforming the first portion based on the transformation that is limited (paragraph 160, the map is used as a reference, so that an iterative closest point (ICP) algorithm can detect and correct the drifting parts of the point cloud).

At the time of the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to compensate for drifting error. The motivation for doing so would have been to reduce inaccuracies in which the captured image does not reflect the actual environment. Therefore, it would have been obvious to combine Huber and Wohlfeld with Yoon to obtain the invention as specified in claims 1 and 16.
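The disputed limitation, "limiting the transformation to only a horizontal plane," can be made concrete with a short sketch. This is an illustration of the claim language only, not code from Wohlfeld or any other cited reference; the function name and point format are hypothetical. The translation carries no vertical (z) component and the rotation is yaw-only (pitch and roll are zero), so every point keeps its original height:

```python
import math

def horizontal_transform(points, tx, ty, yaw):
    """Apply a rigid transform limited to the horizontal (x-y) plane.

    The translation has no vertical (z) component and the rotation has
    only a yaw component (pitch and roll are zero), so each point's
    height passes through unchanged.
    """
    c, s = math.cos(yaw), math.sin(yaw)
    out = []
    for x, y, z in points:
        # Yaw-only rotation about the vertical axis, then planar translation.
        out.append((c * x - s * y + tx, s * x + c * y + ty, z))
    return out
```

Under this reading, storing pitch and roll values (as Wohlfeld's paragraph 122 does) is compatible with a horizontal-plane-limited transform only if those components are zero when the transform is applied, which is precisely what the parties dispute above.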
Referring to claims 2 and 17, Yoon discloses the 3D measuring device, but does not disclose expressly wherein the 3D measuring device comprises a LIDAR sensor. Wohlfeld discloses wherein the 3D measuring device comprises a LIDAR sensor to capture a digital representation of the multi-level architecture as the 3D measuring system is transported in the multi-level architecture (paragraph 85, In an embodiment, the 3D scanner 310 is a time-of-flight (TOF) laser scanner such as that shown and described in reference to FIGS. 27-29). At the time of the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to use a LIDAR sensor. The motivation for doing so would have been to incorporate an efficient sensor that is capable of accurately measuring surrounding objects. Therefore, it would have been obvious to combine Wohlfeld with Yoon to obtain the invention as specified in claims 2 and 17.

Referring to claim 3, Yoon discloses wherein the 3D measuring device continuously transmits a captured data to a computing system as the 3D measuring device is moved in the multi-level architecture, the computing system comprising the one or more processing units (paragraph 262, As illustrated in (a) of FIG. 21A, the robot R may travel within the building 1000 while performing sensing or scanning of the space inside the building 1000).

Referring to claim 4, Yoon discloses wherein the computing system generates a 3D point cloud representing the multi-level architecture based on the captured data and stores the 3D point cloud (paragraph 85, Accordingly, some example embodiments provide a method of generating a specific map for a specific floor by accurately reflecting the characteristics and situations of spaces within the building 1000, a method of allocating nodes to the specific map 1700, and a method of providing a user environment therefor).
Referring to claim 5, Yoon discloses wherein the 3D measuring device is configured for wireless communication with the computing system (paragraph 372, Communication performed between devices of some example embodiments (e.g., the control unit 150, the server 20, the cloud server 21, the edge server 22, the robot R, the building system 1000a, the map generation system 3000, the control unit 330 and/or the electronic device 50) may be performed via wired and/or wireless communication).

Referring to claims 6 and 18, Yoon discloses determining an altitude of the 3D measuring device, and Huber discloses an orientation sensor (paragraph 47, In some embodiments, one may utilize barometric pressure sensors/altimeter to establish scanning height and subsequent floor height). Yoon and Huber do not disclose expressly wherein the orientation sensor comprises a gyroscope, an accelerometer, and magnetometer. Wohlfeld discloses wherein the orientation sensor comprises a gyroscope, an accelerometer, and magnetometer (paragraph 103, The IMU 474 is a position/orientation sensor that may include accelerometers 494 (inclinometers), gyroscopes 496, a magnetometers or compass 498, and altimeters). At the time of the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to use a gyroscope, an accelerometer, and a magnetometer. The motivation for doing so would have been to incorporate efficient sensors that are capable of accurately determining position and orientation in order to better monitor a device's status. Therefore, it would have been obvious to combine Wohlfeld with Yoon and Huber to obtain the invention as specified in claims 6 and 18.
Referring to claims 7 and 19, Yoon discloses wherein determining the level index for the first portion comprises: monitoring the altitude estimated by the sensor (paragraph 145, The monitoring screen 1400 is a screen that may monitor the plurality of robots R positioned within the building 1000, which includes a plurality of floors); incrementing a previous level index in response to the altitude estimated by the sensor increasing at least by a predetermined threshold; and decrementing the previous level index in response to the altitude estimated by the sensor decreasing at least by the predetermined threshold (paragraph 150, Here, "state graphic object 1420" may be understood as a graphic object configured with a visual exterior appearance corresponding to state information, so that the state information on the robots R positioned on each of the plurality of floors within the building 1000 is displayed) (paragraph 72, These facility infrastructures 1 and 2 may support vertical movement of the robot R between different floors of the building 1000). Huber discloses monitoring the altitude estimated by the orientation sensor (paragraph 47, In some embodiments, one may utilize barometric pressure sensors/altimeter to establish scanning height and subsequent floor height).

Referring to claim 8, Yoon discloses wherein the previous level index is an initial level index that is configured prior to scanning the multi-level architecture and according to the level at which the scanning is initiated (paragraph 163, The specific map 1700 may be stored in the storage unit 320 along with an editing history for the specific map).
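The increment/decrement logic recited in claims 7 and 19 can be sketched as follows. This is an illustration of the claim language only, not code from any cited reference; the function name, the reference-altitude bookkeeping, and the threshold in the usage note are hypothetical:

```python
def update_level_index(level_index, reference_altitude, altitude, threshold):
    """Track a floor (level) index from a monitored altitude estimate.

    Increments the previous level index when the altitude has risen by
    at least the predetermined threshold since the last level change,
    and decrements it when the altitude has fallen by at least the
    threshold; otherwise the index and reference altitude are unchanged.
    """
    delta = altitude - reference_altitude
    if delta >= threshold:
        return level_index + 1, altitude
    if delta <= -threshold:
        return level_index - 1, altitude
    return level_index, reference_altitude
```

For example, with a 3.0 m threshold, climbing from 0.0 m to 3.2 m increments the index, while small drift between 3.2 m and 3.5 m leaves it alone.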
Referring to claim 9, Yoon discloses wherein determining the level index for the first portion comprises an operator entering the level index (paragraph 258, In some example embodiments, the spatial characteristic information matched to the specific map 1700 may be specified on the basis of the characteristics of the space 10 corresponding to the specific floor being determined by the control unit 330 during the process of generating the specific map 1700, or may be specified on the basis of information input by the administrator of the system 3000).

Referring to claim 10, Yoon discloses wherein the level index is associated with the first portion in response to the level index being changed to a second level index (paragraph 156, In one area of the items 1510, 1520, and 1530, there may be included function icons (e.g., "map generation function icon" or "map editing function icon" 1510a, 1530b) related to functions that receive a map editing request input for the floor corresponding to each item).

Referring to claim 11, Yoon discloses wherein the level index is associated with the first portion captured by the 3D measuring device in a continuous manner (paragraph 262, 264, As illustrated in (a) of FIG. 21A, the robot R may travel within the building 1000 while performing sensing or scanning of the space inside the building 1000).

Referring to claim 12, Yoon discloses wherein associating the level index with the first portion comprises storing the level index in a metadata of a digital representation of the first portion (paragraph 156, The map list 1500 may include items 1510, 1520, and 1530, each corresponding to at least one of a plurality of floors).
Referring to claim 14, Wohlfeld discloses wherein the transformation is determined based on one or more constraints (paragraph 154, When they are available, local reference systems such as spheres or points or checkers can be used as reference points by contemporary systems to reduce or minimize the drift).

Referring to claim 15, Wohlfeld discloses wherein the one or more constraints are based on one or more corresponding features of the first portion and the second portion (paragraph 154, When they are available, local reference systems such as spheres or points or checkers can be used as reference points by contemporary systems to reduce or minimize the drift).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to PETER K HUNTSINGER whose telephone number is (571)272-7435. The examiner can normally be reached Monday - Friday 8:30 - 5:00. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Benny Q Tieu, can be reached at 571-272-7490. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/PETER K HUNTSINGER/
Primary Examiner, Art Unit 2682

Prosecution Timeline

Jul 21, 2023
Application Filed
Nov 05, 2025
Non-Final Rejection — §103
Jan 22, 2026
Response Filed
Feb 12, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12540884: Determining Fracture Roughness from a Core (2y 5m to grant; granted Feb 03, 2026)
Patent 12412381: METHODS AND SYSTEMS FOR CONTROLLING OPERATION OF WIRELINE CABLE SPOOLING EQUIPMENT (2y 5m to grant; granted Sep 09, 2025)
Patent 12387360: APPARATUS AND METHOD FOR ESTIMATING UNCERTAINTY OF IMAGE COORDINATE (2y 5m to grant; granted Aug 12, 2025)
Patent 12388943: PRINTING SYSTEM USING FLUORESENT AND NON-FLUORESENT INK, PRINTING APPARATUS, IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND CONTROL METHOD THEREOF (2y 5m to grant; granted Aug 12, 2025)
Patent 12374081: DIGITAL IMAGE PROCESSING TECHNIQUES USING BOUNDING BOX PRECISION MODELS (2y 5m to grant; granted Jul 29, 2025)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 28%
With Interview: 45% (+16.7%)
Median Time to Grant: 4y 11m
PTA Risk: Moderate
Based on 322 resolved cases by this examiner. Grant probability derived from career allow rate.
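As a note on how these figures fit together: assuming the interview lift is additive in percentage points (the numbers above are consistent with 28% + 16.7 ≈ 45%), a minimal, hypothetical sketch:

```python
def grant_probability_with_interview(base_rate_pct, lift_points):
    """Combine a base grant probability with an additive interview lift,
    both in percent, clamped to the 0-100 range. Additivity is an
    assumption about how the dashboard derives its figure, not a
    documented formula.
    """
    return max(0.0, min(100.0, base_rate_pct + lift_points))
```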
