Prosecution Insights
Last updated: April 19, 2026
Application No. 18/191,757

MEASURING SYSTEM FOR A CONSTRUCTION AND WORK MACHINE

Final Rejection §103
Filed: Mar 28, 2023
Examiner: RAYNAL, ASHLEY BROWN
Art Unit: 3648
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Moba Mobile Automation AG
OA Round: 2 (Final)
Grant Probability: 78% (Favorable)
OA Rounds: 3-4
To Grant: 2y 9m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 78% (28 granted / 36 resolved), above average, +25.8% vs TC avg
Interview Lift: +22.7% (resolved cases with interview), a strong lift
Avg Prosecution: 2y 9m (typical timeline), 33 currently pending
Total Applications: 69, across all art units
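The headline figures in this card can be reproduced from the raw counts above. A minimal sketch, assuming the allow rate is simply granted/resolved rounded to a whole percent, and that the TC-average comparison is the displayed rate minus the displayed delta (both formulas are assumptions; only the input figures come from the card):

```python
# Hypothetical reconstruction of the dashboard arithmetic.
granted, resolved = 28, 36

allow_rate = round(granted / resolved * 100)   # displayed "Career Allow Rate"
implied_tc_avg = round(78 - 25.8, 1)           # 78% minus the "+25.8% vs TC avg" delta

print(allow_rate)       # 78
print(implied_tc_avg)   # 52.2
```

So 28/36 lands on the displayed 78%, and the delta implies a Tech Center average allow rate near 52%.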

Statute-Specific Performance

§101: 7.5% (-32.5% vs TC avg)
§103: 48.4% (+8.4% vs TC avg)
§102: 19.6% (-20.4% vs TC avg)
§112: 24.6% (-15.4% vs TC avg)
Tech Center averages are estimates • Based on career data from 36 resolved cases
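One quick consistency check on these figures: assuming each delta is the examiner's rate minus the Tech Center average (an assumed relationship, not stated by the chart), all four statute pairs back out the same TC-average estimate of roughly 40%:

```python
# Inputs are the displayed per-statute figures; the relationship
# delta = examiner rate - TC average is an assumption.
stats = {
    "§101": (7.5, -32.5),
    "§103": (48.4, +8.4),
    "§102": (19.6, -20.4),
    "§112": (24.6, -15.4),
}
for statute, (rate, delta) in stats.items():
    print(statute, round(rate - delta, 1))  # prints 40.0 for each statute
```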

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

The following is a final Office action in response to the communication filed on 11/05/2025. Claims 1, 11, 15 and 17 have been amended. Claims 18 and 19 have been added. Claims 1-19 are currently pending and have been examined.

Response to Arguments

Applicant’s arguments and remarks filed on 11/05/2025 have been fully considered. Applicant’s amendments do not overcome the objection to claim 11 because the amended claim still contains bullet points; see claim objections below. Applicant’s amendments overcome the 35 U.S.C. §112(b) rejection of claim 17. Applicant’s arguments provided for the 35 U.S.C. §103 rejections of claims 1-17 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Objections

Claim 1 is objected to because of the following informalities: in the final line, “a group comprising joint and/or virtual pivot point” should read “a group comprising a joint and/or a virtual pivot point”. Appropriate correction is required. Claim 11 is objected to because of the following informalities: the claims include the use of hyphens as bullet points to mark a plurality of steps. As stated in MPEP § 608.01(m), “Where a claim sets forth a plurality of elements or steps, each element or step of the claim should be separated by a line indentation, 37 CFR 1.75(i)”. Examiner recommends removing both hyphens and line indentations as only two items remain in the list. Appropriate correction is required.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-5, 8-14, 18 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Vesanen et al. (EP-3730702-A1; hereinafter Vesanen) in view of Wells et al. (US-12354219-B1; hereinafter Wells).
Regarding claim 1, Vesanen discloses [Note: what Vesanen fails to disclose is struck through; struck claim language is shown here in brackets] A calibration system for calibrating (see at least [0001]; “The present invention relates to a measuring arrangement relating, for example, to earthworks machines or lifting machines, which measuring arrangement may, for example, be utilized in an individual calibration of each machine.”) a component of a construction machine (see at least [0004]; “A problem relating to the automatic positioning of the work machine and its working tool is, however, variations in the measures or dimensions of the work machines.”), in particular an excavator (see at least [0002]; “Different types of work machines may be utilized at different earth-moving work sites or construction sites for example for moving soil or rock material to another location or to lift materials to be used in the constructions… The work machines like that are for example excavators and mobile cranes.”), a bulldozer, a grader, a drill rig, a pile driver or a diaphragm wall cutter, wherein the component comprises at least one degree of freedom (see at least [0004]; “For example, as regards to excavators wherein there is an upper carriage rotatable relative to a lower carriage, it is very difficult to take into account for example in a positioning of a tip of a bucket a position of a rotation axis of the upper carriage relative to a boom pin that fastens a boom of the excavator to the upper carriage of the excavator.”), comprising: a [struck: mobile device comprising a LiDAR sensor] (see at least [0043]; “The locator may also be or comprise at least one tachymeter, at least one theodolite or at least one laser scanning device.”), wherein the [struck: LiDAR sensor is configured to detect a plurality of measurement points] (see at least [0044]; “The positioning arrangement further comprises at least one first spot to be located. The positioning arrangement thus comprises one first spot or two or more first spots to be located.
The feature the first spot refers to a specific point in the machine which can be preferably individually identified in the machine.”) of the component and/or the construction machine (see at least [0044]; “The at least one first spot may thus be one or more selected points at the carriage 2 of the excavator 1 the position(s) of which is/are to be determined during the carrying out of the measuring procedure.”) to determine position information for the plurality of measurement points of the component and/or the construction machine (see at least [0058]; “The method for measuring a three dimensional location and orientation of the center axis of a first axle in relation to the center axis of a second axle comprises attaching at least one first spot to be located by at least one locator and to be rotatable around the first axle; attaching at least one second spot to be located by at least one locator and to be rotatable around the second axle; measuring by the at least one locator a first set of at least three different position data measurements of each of the at least one first spot…”); a processor configured to determine a 3D model of the component and/or the construction machine based on the position information for the plurality of measurement points (see at least [0060] – [0067], where the processing unit derives the 3D model shown in Figs. 5a – 5c based on the position of the measured spots); wherein the plurality of measurement points are formed by one or more specific points of the component and/or the construction machine (see at least [0045]; “The positioning arrangement further comprises at least one second spot to be located. The positioning arrangement thus comprises one second spot or two or more second spots to be located. The feature the second spot refers to a specific point in the machine which can be preferably individually identified in the machine. 
The at least one second spot may thus be one or more selected points at the boom 5 of the excavator 1 the position(s) of which is/are to be determined during the carrying out of the measuring procedure. The at least one second spot may for example be a tag 21 or an antenna 16 fixed at a specific point in the first boom part 5a of the boom 5. The number of the tags 21 may be higher than only one.”) out of a group comprising a joint (see at least Fig. 2, where cameras 23 image the entire exterior of the machine, including joints such as boom pin 8. Examiner considers any point of the machine captured in an image to be a measurement point, the points together forming a group of measurement points.); and/or a virtual pivot point.

However, Vesanen does not explicitly teach that the locator device performing the measurements is mobile, nor does Vesanen explicitly disclose the use of LiDAR. Vesanen discloses an arrangement for taking depth measurements to calculate the relative location of axles in a work machine, and Wells is directed to generating measurable 3D models using camera and LIDAR scans. Wells teaches: A mobile device comprising a LiDAR sensor (see at least Fig. 1, LIDAR instrument 100), wherein the LiDAR sensor is configured to detect a plurality of measurement points (see at least col. 6, lines 2-9; “The LIDAR sensor emits pulsed light waves into the surrounding environment. These pulses bounce off surrounding objects and are returned to the sensor. The sensor(s) use the time it takes for each pulse to return to the sensor to calculate the precise distance it traveled. This process is repeated millions of times per second to create a precise real-time three-dimensional map of the environment from that particular location.”) of the component and/or the construction machine (see at least col.
7, lines 9-15; “The details described below may relate to a carwash facility and the elements included within the car wash facility, but it should be known that the elements and processes may be used in any number of other facilities such as distribution centers or manufacturing facilities such as vehicle manufacturing lines/systems or farming equipment lines/systems.”) to determine position information for the plurality of measurement points of the component and/or the construction machine (see at least col. 6, lines 7-9; “This process is repeated millions of times per second to create a precise real-time three-dimensional map of the environment from that particular location.”); a processor (see at least col. 4, lines 24-26; “Of course, a single computer system may be used to process the images received using the systems of the present disclosure.”) configured to determine a 3D model of the component and/or the construction machine based on the position information for the plurality of measurement points (see at least Fig. 2 and col. 6, lines 9-17; “The onboard computer to the LIDAR system or a remotely located system is further used to analyze the resultant images. The output from each of the camera and the LIDAR instrument are then typically combined to form a remotely viewable and measurable output. Mostly typically visual images and LIDAR images are taken at a variety of locations shown as a dot on the image in FIG. 2.”); wherein the plurality of measurement points are formed by one or more specific points of the component and/or the construction machine (see at least col. 8, lines 30-34; “Each of the “dots” 116 in FIG. 
2 represent a location where one or more scans of either or both of a LIDAR and/or visual photograph or video image was taken and processed into the image for later use and analysis.”). Vesanen uses a stereo camera or laser scanner to extract 3D measurements of specific points on a work machine, and Wells uses a camera and LIDAR to extract 3D measurements of specific points in an industrial facility. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to use the camera and LIDAR sensor of Wells in the context of a construction machine, as taught by Vesanen. Such a modification would have a reasonable expectation of success because both Vesanen and Wells use cameras and laser-based technologies to measure specific points in three-dimensional space. Using the sensor of Wells in the context of Vesanen would represent simple substitution of one known element for another to obtain predictable results.

Regarding claim 2, Vesanen in view of Wells teaches the calibration system according to claim 1. Vesanen further teaches: wherein the position information comprises distance information starting from the LiDAR sensor; and/or wherein the position information comprises 3D position information in space (see at least [0078]; “According to an embodiment the measuring arrangement is further configured to determine at least one 3D-point of the earthworks machine in three dimensions with respect to a determined point in the second axle, and the position data gathering unit is further configured to gather a third set of at least one position data measurement of the at least one 3D-point to be located with respect to at least one of the at least one set of the position data measurements and the information rendered from the at least one set of the position data measurements.”); and/or wherein the position information comprises 3D position information in a coordinate system defined by the LiDAR sensor.
Regarding claim 3, Vesanen in view of Wells teaches the calibration system according to claim 1. Vesanen further teaches: wherein the 3D model comprises depth information; and/or wherein a 3D position of each component is determined by at least two measurement points of the component (see at least [0061]; “Secondly, further referring to Figure 5a, the processing unit is configured to define a second plane P2' based on each different position measurements in the second set of position data measurements. The second set of the position data measurements of Figure 5a comprises three position data measurements M21, M22, M23 of one second spot, i.e. a specific point in the boom 5, such as the tag 21 in the first boom part 5a, each position data measurement M21, M22, M23 being carried out at different angle of the rotation of the boom 5 about the rotation axis 9 of the boom pin 8 as shown schematically in Figure 1. The position data measurements M21, M22, M23 may be carried out by the positioning arrangement of Figure 4 including the stereo camera arrangement 22 shown schematically in Figure 2. The second plane P2' is a plane which is determined by the second set of the position data measurements M21, M22, M23 forming three vectors between the position data measurements M21, M22 and M23, rendering vectors M21 to M22, M22 to M23 and M23 to M21, for example. These vectors define plane P2' that is perpendicular to the centre axis of the second axle, i.e. the boom pin 8.”). Regarding claim 4, Vesanen in view of Wells teaches the calibration system according to claim 1. 
Vesanen further teaches: wherein the LiDAR sensor is configured to detect the measurement points in a plurality of orientations of the LiDAR sensor to the component and/or construction machine; and/or wherein detecting the plurality of measurement points is performed in one pose of the component and/or the construction machine; or wherein detecting the measurement points is performed in a plurality of poses of the component and/or the construction machine (see at least [0061]; “Secondly, further referring to Figure 5a, the processing unit is configured to define a second plane P2' based on each different position measurements in the second set of position data measurements. The second set of the position data measurements of Figure 5a comprises three position data measurements M21, M22, M23 of one second spot, i.e. a specific point in the boom 5, such as the tag 21 in the first boom part 5a, each position data measurement M21, M22, M23 being carried out at different angle of the rotation of the boom 5 about the rotation axis 9 of the boom pin 8 as shown schematically in Figure 1.”). Regarding claim 5, Vesanen in view of Wells teaches the calibration system according to claim 1. Vesanen further teaches: wherein the LiDAR sensor is configured to determine position information for the plurality of measurement points of a plurality of components comprising a plurality of degrees of freedom; and/or wherein the processor is configured to determine the 3D model comprising the plurality of components (see at least [0059]; “According to an embodiment the three dimensional location and orientation of the centre axis of the first axle, i.e. the rotation axis 4 of the rotation axle 3 of the upper carriage 2b, with respect to the centre axis of the second axle, i.e. the rotation axis 9 of the boom pin 8, may be determined with the following procedure, referring especially to Figures 5a to 5c but also to Figures 1, 2, 3a to 3c and 4. 
The procedure utilizes vector analysis and is carried out by the processing or calculation unit, such as the control unit 14.”). Regarding claim 8, Vesanen in view of Wells teaches the calibration system according to claim 1. Wells further teaches: wherein the LiDAR sensor is configured to determine a point cloud for the plurality of measurement points (see at least col. 1, lines 35-44; “One aspect of the present disclosure includes a method of creating a three-dimensional, measurable, virtual model of a facility including taking first and second 360-degree scans at different locations of the facility using both a visual camera and a LIDAR instrument, saving the first and second 360-degree scans on a data storage unit of a portable computing device, creating first and second images and a first point cloud and a second point clouds on the portable computing device from the first and second 360-degree scans and coupling point clouds with the images.”). It would have been obvious to combine Vesanen and Wells for the reasons given regarding claim 1. Regarding claim 9, Vesanen in view of Wells teaches the calibration system according to claim 1. Wells further teaches: wherein the LiDAR sensor is part of a LiDAR scanner (see at least col. 
2, lines 9-23; “Yet another aspect of the present disclosure is generally directed to a method of analyzing and planning a remodeling, a reconstruction, a construction of a facility, surveying a facility for repair or replacement of a party or safety concerns, or planning a replacement of a piece of machinery associated with a facility (the facility typically being a vehicle cleaning facility and the machinery typically being vehicle cleaning machinery and systems) that includes the steps of: conducting LIDAR scans and photographic imaging at a plurality of locations within a facility using a LIDAR scanner and photographic imaging device wherein the plurality of locations are of a sufficient number to scan and provide imaging of an entire facility with the optional exception of any office space within the facility…”); and/or wherein the LiDAR sensor is part of a LiDAR scanner configured to emit light in correspondence with a dot grid. It would have been obvious to combine Vesanen and Wells for the reasons given regarding claim 1. Regarding claim 10, Vesanen in view of Wells teaches the calibration system according to claim 1. Wells further teaches: wherein the LiDAR sensor is configured to perform a distance measurement based on a light reflection and/or a time-of-flight measurement of a light reflection (see at least col. 6, lines 2-9; “The LIDAR sensor emits pulsed light waves into the surrounding environment. These pulses bounce off surrounding objects and are returned to the sensor. The sensor(s) use the time it takes for each pulse to return to the sensor to calculate the precise distance it traveled. This process is repeated millions of times per second to create a precise real-time three-dimensional map of the environment from that particular location.”). It would have been obvious to combine Vesanen and Wells for the reasons given regarding claim 1. Regarding claim 11, Vesanen in view of Wells teaches the calibration system according to claim 1. 
Vesanen further teaches: wherein the marked, color-coded and/or raised measurement points (see at least [0026]; “According to an embodiment of the measuring arrangement the at least one first spot and the at least one second spot are at least one of: tags, prisms and antennas to be located by at least one of the at least one locator, wherein the at least one locator is at least one of: at least one stereo camera arrangement, at least one tachymeter, at least one theodolite, at least one of laser scanning device, a satellite-based positioning system GNSS and any network where location based services using triangulation is possible.”) arranged on the construction machine or component (see at least [0009]; “According to an idea of the solution at least one first spot and at least one second spot to be located are selected in the work machine.”); wherein the plurality of measurement points are formed by specific points of the component and/or the construction machine from a group, the group comprising - fixed point (see at least Fig. 1; tag 21 is at a fixed point on the boom) and/or - contact point of the tool. However, Vesanen does not explicitly teach the use of LiDAR. Wells teaches a LiDAR sensor (see at least col. 6, lines 2-9; “The LIDAR sensor emits pulsed light waves into the surrounding environment. These pulses bounce off surrounding objects and are returned to the sensor. The sensor(s) use the time it takes for each pulse to return to the sensor to calculate the precise distance it traveled. This process is repeated millions of times per second to create a precise real-time three-dimensional map of the environment from that particular location.”). It would have been obvious to combine Vesanen and Wells for the reasons given regarding claim 1. Regarding claim 12, Vesanen in view of Wells teaches the calibration system according to claim 1. 
Vesanen further teaches: the calibration system comprising an interface for wireless communication with a machine controller (see at least [0047]; “The position data gathering unit referred above may for example be connected to a stereo camera arrangement 22 via a data communication connection, either wireless or wired, and may reside in the control unit 14. The stereo camera arrangement 22 comprises at least two cameras 23 that are able to determine a 3D position of the tags to be monitored by triangulation when the location and orientation of the at least two cameras 23 in relation to each other are known.” See also [0038]; “The excavator 1 further comprises at least one control unit 14 which is configured to control, in response to received control actions, operations of the excavator 1, such as operations of the carriage 2, the boom 5 and the bucket 11.”). Regarding claim 13, Vesanen in view of Wells teaches the calibration system according to claim 1. Vesanen further teaches: the calibration system comprising a machine controller (see at least [0038]; “The excavator 1 further comprises at least one control unit 14 which is configured to control, in response to received control actions, operations of the excavator 1, such as operations of the carriage 2, the boom 5 and the bucket 11.”) and/or a machine display, wherein the machine display is configured to calculate and/or display a position and/or positionings of the component based on one or more sensor data for monitoring one or more degrees of freedom while considering the 3D model; wherein the machine controller is configured to calculate a position and/or positioning of the component based on one or more sensor data for monitoring one or more degrees of freedom while considering the 3D model (see at least [0048]; “Referring to the example disclosed in Figures 1, 2 and 3a to 3c above and in Figures 5a to 5c later, the processing unit may for example be the control unit 14, or reside in the control unit 14, 
of the excavator 1, whereby the control unit 14 is configured to receive the position data referring to the head of the antenna 16 and the tag 21, and on the basis thereof, to determine the three dimensional location and orientation of the centre axis of the first axle, i.e. the rotation axis 4 of the rotation axle 3 of the upper carriage 2b, with respect to the centre axis of the second axle, i.e. the lifting axis 9 or the rotation axis 9 of the boom pin 8.” Examiner notes that Figs. 5a-5c show 3D models.).

Regarding claim 14, Vesanen in view of Wells teaches the calibration system according to claim 1. Vesanen further teaches: A construction machine, in particular an excavator (see at least Fig. 1, excavator 1), bulldozer, grader, drilling rig, pile driver or diaphragm wall cutter, comprising a calibration system (see at least [0040]; “The measuring arrangement disclosed next is intended to eliminate the effect of any variation in the measure between the rotation axis 4 of the upper carriage 2b and the rotation axis 9 of the boom 5, and in that sense to calibrate the control of the excavator 1.”), and a machine controller (see at least [0038]; “The excavator 1 further comprises at least one control unit 14 which is configured to control, in response to received control actions, operations of the excavator 1, such as operations of the carriage 2, the boom 5 and the bucket 11.”).

Regarding claim 18, Vesanen in view of Wells teaches the calibration system of claim 1. Wells further teaches: wherein the camera of the mobile device is used to select 2D points in an image, and the processor projects corresponding LiDAR sensor values to determine 3D positions of the selected measurement points (see at least col. 10, line 63 – col. 11, line 5; “The remote user may use a “virtual measuring tape” to measure distances using the combined LIDAR point cloud location data and the visual images as shown in FIG. 6.
As discussed above, this can be done accurately by the user selecting a starting location and subsequently selecting the ending location for measurement. The data may also be used to present 2-dimensional and 3-dimensional measurements to allow the user to access square footage and volume information about different spaces.”). It would have been obvious to combine Vesanen and Wells for the reasons given regarding claim 1. Regarding claim 19, Vesanen in view of Wells teaches the calibration system of claim 1. Vesanen further teaches: wherein at least two measurement points per component are used to determine the 3D orientation of the component in space (see at least [0061]; “Secondly, further referring to Figure 5a, the processing unit is configured to define a second plane P2' based on each different position measurements in the second set of position data measurements. The second set of the position data measurements of Figure 5a comprises three position data measurements M21, M22, M23 of one second spot, i.e. a specific point in the boom 5, such as the tag 21 in the first boom part 5a, each position data measurement M21, M22, M23 being carried out at different angle of the rotation of the boom 5 about the rotation axis 9 of the boom pin 8 as shown schematically in Figure 1. The position data measurements M21, M22, M23 may be carried out by the positioning arrangement of Figure 4 including the stereo camera arrangement 22 shown schematically in Figure 2. The second plane P2' is a plane which is determined by the second set of the position data measurements M21, M22, M23 forming three vectors between the position data measurements M21, M22 and M23, rendering vectors M21 to M22, M22 to M23 and M23 to M21, for example. These vectors define plane P2' that is perpendicular to the centre axis of the second axle, i.e. the boom pin 8.”). Claims 6-7 are rejected under 35 U.S.C. 103 as being unpatentable over Vesanen in view of Wells, further in view of Yan et al. 
(US-11210863-B1; hereinafter Yan).

Regarding claim 6, Vesanen in view of Wells teaches the calibration system according to claim 1. However, Vesanen does not teach: wherein the mobile device is formed by a smart device, smartphone, or tablet PC. Vesanen uses a locator device comprising a laser scanner to measure the position of specific points on a work machine, and Yan is directed to real-time object placement guidance in an augmented reality experience, in particular by sensing an object in the physical environment. Yan teaches: wherein the mobile device is formed by a smart device, smartphone, or tablet PC (see at least col. 25, lines 12-33; “FIG. 5 illustrates a block diagram of an example of a computing element 500 (for example, the mobile device 102, mobile device 202, mobile device 302, one or more devices 404, online retail store 428, or any other element described herein that may be used to perform processing of any sort) or system upon which any one or more of the techniques (e.g., methodologies) discussed herein may be performed… The computing element 500 may be a server, a personal computer (PC), a smart home device, a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a mobile telephone, a wearable computer device, a web appliance, a network router, a switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine, such as a base station.”). Vesanen uses a locator device comprising a laser scanner to measure the position of specific points on a work machine, and Yan uses a mobile device comprising LiDAR (see col. 3, line 42) to create a virtual representation of an object. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to use the mobile device of Yan to perform the functions of the locator device of Vesanen.
Such a modification would have a reasonable expectation of success because both devices use laser-based technologies to map points in three-dimensional space. One of ordinary skill would be motivated to use the mobile device taught by Yan for the measurements of Vesanen due to the convenience gained by the portability of a mobile device. Regarding claim 7, Vesanen in view of Wells teaches the calibration system according to claim 1. However, Vesanen does not explicitly teach: wherein detecting comprises recording the component and/or the construction machine by means of a camera in the mobile device; and/or wherein the mobile device comprises a human-machine interface via which one or more measurement points can be defined and/or marked by a user; and/or wherein the mobile device comprises a human-machine interface configured to provide an indication to a user regarding the orientation of the LiDAR sensor. Yan teaches: wherein detecting comprises recording the component and/or the construction machine by means of a camera in the mobile device (see at least col. 3, lines 32-43; “The pre-processing phase may begin by an application including an augmented reality module (also referred to herein as system) accessing a camera of the user's mobile device so that it may generate an user interface that may display a real-time view captured by the camera to the user through the display. It should be noted that although reference may be made to a camera of the device herein, the device may use any other sensor to capture data used for any of the purposes herein as well. 
For example, the mobile device may use a LIDAR sensor to capture data about the environment instead of, or in addition to, a camera of the mobile device.”); and/or wherein the mobile device comprises a human-machine interface via which one or more measurement points can be defined and/or marked by a user; and/or wherein the mobile device comprises a human-machine interface configured to provide an indication to a user regarding the orientation of the LiDAR sensor. Vesanen uses a locator device comprising a laser scanner to measure the position of specific points on a work machine, and Yan uses a mobile device comprising LiDAR (see col. 3, line 42) to create a virtual representation of an object. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to use the mobile device of Yan to perform the functions of the locator device of Vesanen. Such a modification would have a reasonable expectation of success because both devices use laser-based technologies to map points in three-dimensional space. One of ordinary skill would be motivated to use the mobile device taught by Yan for the measurements of Vesanen due to the convenience of the real-time display available through the device of Yan (see Yan at least col. 3, line 37). Claims 15-17 are rejected under 35 U.S.C. 103 as being unpatentable over Vesanen et al. (EP-3730702-A1; hereinafter Vesanen) in view of Kaiser et al. (US-20230274040-A1; hereinafter Kaiser). 
Regarding claim 15, Vesanen teaches: A method for calibrating (see at least [0001]; “The present invention relates to a measuring arrangement relating, for example, to earthworks machines or lifting machines, which measuring arrangement may, for example, be utilized in an individual calibration of each machine.”) a component of a construction machine (see at least [0004]; “A problem relating to the automatic positioning of the work machine and its working tool is, however, variations in the measures or dimensions of the work machines.”), in particular an excavator (see at least [0002]; “Different types of work machines may be utilized at different earth-moving work sites or construction sites for example for moving soil or rock material to another location or to lift materials to be used in the constructions… The work machines like that are for example excavators and mobile cranes.”), a bulldozer, a grader, a drill rig, a pile driver or a diaphragm wall cutter, wherein the component comprises at least one degree of freedom (see at least [0004]; “For example, as regards to excavators wherein there is an upper carriage rotatable relative to a lower carriage, it is very difficult to take into account for example in a positioning of a tip of a bucket a position of a rotation axis of the upper carriage relative to a boom pin that fastens a boom of the excavator to the upper carriage of the excavator.”), comprising: detecting, by a device (see at least [0043]; “The locator may also be or comprise at least one tachymeter, at least one theodolite or at least one laser scanning device.”), a plurality of measurement points (see at
least [0044]; “The positioning arrangement further comprises at least one first spot to be located. The positioning arrangement thus comprises one first spot or two or more first spots to be located. The feature the first spot refers to a specific point in the machine which can be preferably individually identified in the machine.”) of the component and/or the construction machine (see at least [0044]; “The at least one first spot may thus be one or more selected points at the carriage 2 of the excavator 1 the position(s) of which is/are to be determined during the carrying out of the measuring procedure.”) to determine position information for the plurality of measurement points of the component and/or the construction machine (see at least [0058]; “The method for measuring a three dimensional location and orientation of the center axis of a first axle in relation to the center axis of a second axle comprises attaching at least one first spot to be located by at least one locator and to be rotatable around the first axle; attaching at least one second spot to be located by at least one locator and to be rotatable around the second axle; measuring by the at least one locator a first set of at least three different position data measurements of each of the at least one first spot…”); and determining a 3D model of the component and/or the construction machine based on the position information for the plurality of measurement points (see at least [0060] – [0067], where the processing unit derives the 3D model shown in Figs. 5a – 5c based on the position of the measured spots). However, Vesanen does not explicitly teach that the locator device performing the measurements is mobile, nor does Vesanen explicitly disclose the use of LiDAR. Vesanen discloses an arrangement for measuring the relative location of axles in a work machine, and Kaiser is directed to 3D modeling of shapes. 
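Vesanen's quoted measuring method (paragraph [0058]) determines an axle's center axis from at least three position measurements of a spot rotated around that axle. As a minimal illustration of that geometry (the function name and the exactly-three-point case are assumptions for the sketch, not taken from Vesanen), each rotated spot traces a circle whose plane normal gives the axis direction and whose center lies on the axis:

```python
import numpy as np

def axis_from_rotating_spot(p1, p2, p3):
    """Recover a rotation axis from three positions of one spot rotated
    around that axis. The spot traces a circle in a plane perpendicular
    to the axis, so the circle's plane normal is the axis direction and
    its center is a point on the axis.

    Returns (center, direction): a 3D point on the axis and a unit
    direction vector.
    """
    a, b, c = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    ab, ac = b - a, c - a
    # Plane normal of the traced circle = axis direction.
    n = np.cross(ab, ac)
    n /= np.linalg.norm(n)
    # Circumcenter of triangle (a, b, c) = circle center (standard
    # closed-form circumcenter of three points in 3D).
    cross = np.cross(ab, ac)
    denom = 2.0 * np.dot(cross, cross)
    center = a + (np.dot(ac, ac) * np.cross(cross, ab)
                  + np.dot(ab, ab) * np.cross(ac, cross)) / denom
    return center, n
```

With more than three measurements per spot, a least-squares circle fit would replace the circumcenter step; repeating the computation for a spot on the second axle yields the relative three-dimensional location and orientation of the two center axes that Vesanen's arrangement reports.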
Kaiser teaches: detecting, by a LiDAR sensor of a mobile device (see at least [0017]; “Aspects and features of the present disclosure provide 3D modeling software that can be used to edit 3D models represented using differentiable, signed distance functions to maintain editability while displaying changes in a manner that is computing resource efficient and fast. Further, such an application can automatically create such an editable 3D model as a starting point using a reference representation that can be readily obtained and stored in a variety of formats. The reference representation can be obtained, as an example, by scanning a physical object using the light detection and ranging (LiDAR) feature of a mobile device.”), a plurality of measurement points to determine position information for the plurality of measurement points (see at least [0024]; “The LiDAR data is a cloud of individual points reflected from everything on any surface in the scene (including walls, woodwork, etc.), however, the sofa in this example is the object of interest. Some parts of the sofa are not visible to the scanner, resulting in LiDAR data that is incomplete with respect to the physical surfaces of the reference object. Nonetheless, the 3D modeling application 102 transforms the reference representation into a common representation by sampling the available points associated with the reference representation of the object and selecting candidate procedural models corresponding to the reference representation based on the common representation.”). Vesanen uses a locator device comprising a laser scanner to measure the position of specific points on a work machine, and Kaiser uses LiDAR to create a reference representation for use in 3D models. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to use the mobile device of Kaiser to perform the functions of the locator device of Vesanen. 
Such a modification would have a reasonable expectation of success because both devices use laser-based technologies to map points in three-dimensional space. One of ordinary skill would be motivated to use the mobile device taught by Kaiser for the measurements of Vesanen because the device of Kaiser produces a reference representation suitable for use in 3D models, as taught by Kaiser (see Kaiser at least [0019]; “For example, a LiDAR scanner can create a reference representation from an actual physical object that is similar to the desired model.”). Regarding claim 16, Vesanen in view of Kaiser teaches the method of claim 15. Vesanen further teaches: wherein detecting comprises recording the component and/or the construction machine by means of a camera of the mobile device; and/or wherein the method comprising calculating and/or displaying a position and/or positioning of the component based on one or more sensor data for monitoring one or more degrees of freedom while considering the 3D model (see at least [0048]; “Referring to the example disclosed in Figures 1, 2 and 3a to 3c above and in Figures 5a to 5c later, the processing unit may for example be the control unit 14, or reside in the control unit 14, of the excavator 1, whereby the control unit 14 is configured to receive the position data referring to the head of the antenna 16 and the tag 21, and on the basis thereof, to determine the three dimensional location and orientation of the centre axis of the first axle, i.e. the rotation axis 4 of the rotation axle 3 of the upper carriage 2b, with respect to the centre axis of the second axle, i.e. the lifting axis 9 or the rotation axis 9 of the boom pin 8.” Examiner notes that Figs. 5a-5c show 3D models.). 
Regarding claim 17, Vesanen teaches: calibrating (see at least [0001]; “The present invention relates to a measuring arrangement relating, for example, to earthworks machines or lifting machines, which measuring arrangement may, for example, be utilized in an individual calibration of each machine.”) a component of a construction machine (see at least [0004]; “A problem relating to the automatic positioning of the work machine and its working tool is, however, variations in the measures or dimensions of the work machines.”), in particular an excavator (see at least [0002]; “Different types of work machines may be utilized at different earth-moving work sites or construction sites for example for moving soil or rock material to another location or to lift materials to be used in the constructions… The work machines like that are for example excavators and mobile cranes.”), a bulldozer, a grader, a drill rig, a pile driver or a diaphragm wall cutter, wherein the component comprises at least one degree of freedom (see at least [0004]; “For example, as regards to excavators wherein there is an upper carriage rotatable relative to a lower carriage, it is very difficult to take into account for example in a positioning of a tip of a bucket a position of a rotation axis of the upper carriage relative to a boom pin that fastens a boom of the excavator to the upper carriage of the excavator.”), comprising: detecting, by a device (see at least [0043]; “The locator may also be or comprise at least one tachymeter, at least one theodolite or at least one laser scanning device.”), a plurality of measurement points (see at least [0044]; “The positioning arrangement further comprises at least one first spot to be located. The positioning arrangement thus comprises one first spot or two or more first spots to be located.
The feature the first spot refers to a specific point in the machine which can be preferably individually identified in the machine.”) of the component and/or the construction machine (see at least [0044]; “The at least one first spot may thus be one or more selected points at the carriage 2 of the excavator 1 the position(s) of which is/are to be determined during the carrying out of the measuring procedure.”) to determine position information for the plurality of measurement points of the component and/or the construction machine (see at least [0058]; “The method for measuring a three dimensional location and orientation of the center axis of a first axle in relation to the center axis of a second axle comprises attaching at least one first spot to be located by at least one locator and to be rotatable around the first axle; attaching at least one second spot to be located by at least one locator and to be rotatable around the second axle; measuring by the at least one locator a first set of at least three different position data measurements of each of the at least one first spot…”); determining a 3D model of the component and/or the construction machine based on the position information for the plurality of measurement points (see at least [0060] – [0067], where the processing unit derives the 3D model shown in Figs. 5a – 5c based on the position of the measured spots). However, Vesanen does not explicitly teach that the locator device performing the measurements is mobile, the use of LiDAR, a non-transitory digital storage medium having a computer program stored thereon to perform, or a computer to run said computer program. Vesanen discloses an arrangement for measuring the relative location of axles in a work machine, and Kaiser is directed to 3D modeling of shapes.
Kaiser teaches: A non-transitory digital storage medium having a computer program stored thereon (see at least [0044]; “The memory device 804 includes any suitable non-transitory computer-readable medium for storing data, program code, or both.”) to perform a method (see at least [0044]; “FIG. 8 is a diagram of an example of a computing system 800 that can implement aspects of modeling shapes using differentiable, signed distance functions, according to certain embodiments. System 800 includes a processor 802 communicatively coupled to one or more memory devices 804. The processor 802 executes computer-executable program code stored in the memory device 804.”) comprising: detecting, by a LiDAR sensor of a mobile device (see at least [0017]; “Aspects and features of the present disclosure provide 3D modeling software that can be used to edit 3D models represented using differentiable, signed distance functions to maintain editability while displaying changes in a manner that is computing resource efficient and fast. Further, such an application can automatically create such an editable 3D model as a starting point using a reference representation that can be readily obtained and stored in a variety of formats. The reference representation can be obtained, as an example, by scanning a physical object using the light detection and ranging (LiDAR) feature of a mobile device.”), a plurality of measurement points of a component to determine position information for the plurality of measurement points of the component (see at least [0024]; “The LiDAR data is a cloud of individual points reflected from everything on any surface in the scene (including walls, woodwork, etc.), however, the sofa in this example is the object of interest. Some parts of the sofa are not visible to the scanner, resulting in LiDAR data that is incomplete with respect to the physical surfaces of the reference object. 
Nonetheless, the 3D modeling application 102 transforms the reference representation into a common representation by sampling the available points associated with the reference representation of the object and selecting candidate procedural models corresponding to the reference representation based on the common representation.”); determining a 3D model of the component based on the position information for the plurality of measurement points (see at least [0024]; “Nonetheless, the 3D modeling application 102 transforms the reference representation into a common representation by sampling the available points associated with the reference representation of the object and selecting candidate procedural models corresponding to the reference representation based on the common representation. Optimization is carried out to produce a selected, procedural model, which is used to produce the 3D, editable model, for which corresponding image 204 is rendered after editing input.”) when said computer program is run by a computer (see at least [0044]). Vesanen uses a locator device comprising a laser scanner to measure the position of specific points on a work machine, and Kaiser uses LiDAR to create a reference representation for use in 3D models. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to use the mobile device of Kaiser to perform the functions of the locator device of Vesanen. Such a modification would have a reasonable expectation of success because both devices use laser-based technologies to map points in three-dimensional space. 
One of ordinary skill would be motivated to use the mobile device taught by Kaiser for the measurements of Vesanen because the device of Kaiser produces a reference representation suitable for use in 3D models, as taught by Kaiser (see Kaiser at least [0019]; “For example, a LiDAR scanner can create a reference representation from an actual physical object that is similar to the desired model.”). Conclusion Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. /ASHLEY BROWN RAYNAL/Examiner, Art Unit 3648 /VLADIMIR MAGLOIRE/Supervisory Patent Examiner, Art Unit 3648

Prosecution Timeline

Mar 28, 2023
Application Filed
Apr 25, 2025
Non-Final Rejection — §103
Nov 05, 2025
Response Filed
Nov 24, 2025
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601844
Satellite Signal Spoofing Detection System
2y 5m to grant • Granted Apr 14, 2026
Patent 12578427
SYSTEMS AND METHODS FOR GENERATING INDEPENDENT TRANSMIT AND RECEIVE CALIBRATION MATRICES FOR MIMO RADAR SYSTEMS
2y 5m to grant • Granted Mar 17, 2026
Patent 12567909
COHERENT RECEIVING DEVICE AND ANEMOMETRY LIDAR SYSTEM
2y 5m to grant • Granted Mar 03, 2026
Patent 12560703
GENERATING POINT CLOUDS BASED UPON RADAR TENSORS
2y 5m to grant • Granted Feb 24, 2026
Patent 12554013
AUTOMATIC OBSTACLE AVOIDANCE METHOD, ELECTRONIC DEVICE, AND UNMANNED AERIAL VEHICLE
2y 5m to grant • Granted Feb 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
78%
Grant Probability
99%
With Interview (+22.7%)
2y 9m
Median Time to Grant
Moderate
PTA Risk
Based on 36 resolved cases by this examiner. Grant probability derived from career allow rate.
