Prosecution Insights
Last updated: April 19, 2026
Application No. 17/981,377

CONSTRUCTION LAYOUT USING ROBOTIC MARKING AND AUGMENTED REALITY METADATA OVERLAYS

Non-Final OA — §103, §112
Filed: Nov 04, 2022
Examiner: HANN, JAY B
Art Unit: 2186
Tech Center: 2100 — Computer Architecture & Software
Assignee: Trimble Inc.
OA Round: 1 (Non-Final)
Grant Probability: 61% (Moderate)
OA Rounds: 1-2
To Grant: 3y 5m
With Interview: 95%

Examiner Intelligence

Career Allow Rate: 61% — grants 61% of resolved cases (281 granted / 463 resolved; +5.7% vs TC avg)
Interview Lift: +34.1% — strong lift for resolved cases with an interview vs without
Avg Prosecution: 3y 5m typical timeline; 31 applications currently pending
Total Applications: 494 career history, across all art units
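The headline figures above can be reproduced from the raw counts shown (281 granted of 463 resolved; 95% grant rate with interview). A minimal sketch, assuming the "interview lift" is the simple difference between the with-interview grant rate and the overall allow rate:

```python
# Reproduce the examiner dashboard metrics from the raw counts above.
granted, resolved = 281, 463

allow_rate = granted / resolved  # career allow rate
print(f"Career allow rate: {allow_rate:.1%}")  # -> 60.7%, displayed as 61%

# Interview lift, assuming it is the gap between the with-interview
# grant rate (95%) and the overall allow rate (an assumption about
# how the dashboard defines "lift").
with_interview = 0.95
lift = with_interview - allow_rate
print(f"Interview lift: {lift:+.1%}")  # -> +34.3%, close to the displayed +34.1%
```

The small gap between +34.3% and the displayed +34.1% suggests the dashboard computes the lift from unrounded inputs, but the figures are mutually consistent.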

Statute-Specific Performance

§101: 21.5% (-18.5% vs TC avg)
§103: 35.9% (-4.1% vs TC avg)
§102: 13.7% (-26.3% vs TC avg)
§112: 24.9% (-15.1% vs TC avg)
Deltas are relative to a Tech Center average estimate • Based on career data from 463 resolved cases
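The "vs TC avg" deltas imply the underlying Tech Center baseline. A quick sketch recovering it, assuming each delta is (examiner rate − TC average):

```python
# Recover the implied Tech Center average allow rate per statute,
# assuming each displayed delta = examiner rate - TC average.
stats = {
    "101": (21.5, -18.5),
    "103": (35.9, -4.1),
    "102": (13.7, -26.3),
    "112": (24.9, -15.1),
}
for statute, (rate, delta) in stats.items():
    tc_avg = rate - delta
    print(f"Sec. {statute}: examiner {rate:.1f}% vs TC avg {tc_avg:.1f}%")
# All four deltas imply the same ~40.0% baseline, consistent with a single
# flat "Tech Center average estimate" rather than per-statute averages.
```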

Office Action

§103, §112
DETAILED ACTION

Claims 1-20 are presented for examination. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Drawings

The drawings received on 4 November 2022 are objected to as color drawings without a granted petition under 37 CFR 1.84(a)(2). Figures 3 and 4 include the color red. Color photographs and color drawings are not accepted in utility applications unless a petition filed under 37 CFR 1.84(a)(2) is granted. Any such petition must be accompanied by the appropriate fee set forth in 37 CFR 1.17(h); one set of color drawings or color photographs, as appropriate, if submitted via EFS-Web, or three sets if not submitted via EFS-Web; and, unless already present, an amendment to include the following language as the first paragraph of the brief description of the drawings section of the specification:

The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.

Color photographs will be accepted if the conditions for accepting color drawings and black and white photographs have been satisfied. See 37 CFR 1.84(b)(2) and MPEP §608.02. If applicant does not wish to file a petition under 37 CFR 1.84(a)(2) and amend the brief description of the drawings section of the specification as noted above, applicant may file replacement black and white line drawings in compliance with 37 CFR 1.84 and 1.121(d). See MPEP §608.02.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-3 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor, or for pre-AIA the applicant, regards as the invention.

Claim 1 recites “aligning the internal map of the augmented-reality device.” The limitation “the internal map” has insufficient antecedent basis in the claim. Dependent claims 2 and 3 are rejected for depending from a rejected claim.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-3

Claims 1-3 are rejected under 35 U.S.C. 103 as being unpatentable over US 10,713,607 B2 Pettersson, et al.
[herein “Pettersson”] in view of US patent 10,788,323 B2 Singer [herein “Singer”] and Zollmann, S., et al., “Augmented Reality for Construction Site Monitoring and Documentation,” Proceedings of the IEEE, vol. 102, no. 2 (2014) [herein “Zollmann”], and alternatively further in view of Degani, A., et al., “An Automated System for Projection of Interior Construction Layouts,” IEEE Transactions on Automation Science & Engineering, vol. 16, no. 4 (2019) [herein “Degani”].

Claim 1 recites “1. A method for construction layout using robotic marking and augmented-reality metadata.” Pettersson column 12 lines 43-48 disclose:

The mobile support system might be a ground based robotic vehicle or a flying robotic vehicle (UAV) and the markings might be based on at least one of a 2D marking, e.g. a painted marking, a laser burned marking, or written information, a 3D structure, e.g. by a surface spattering device.

A robotic vehicle is robotic. Pettersson does not explicitly disclose augmented reality; however, in the analogous art of building construction surveying information, Singer column 9 lines 20-21 teaches “the [Augmented-Reality (AR)]-data may be overlaid onto the real view of the scene which the user has when he is wearing the AR-device.” It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine Pettersson and Singer. One having ordinary skill in the art would have found motivation to incorporate augmented reality into the system of construction site management with a marking robot for the advantageous purposes of displaying “virtual objects with a precise spatial link to the coordinate system, i.e. to the natural environment” and “improving the positioning accuracy of displayed AR-data on the display of an AR-device.” See Singer column 2 lines 10-12 and 41-43.
Claim 1 further recites “the method comprising: receiving data from a building plan, wherein the building plan includes a plurality of elements.” Pettersson column 12 lines 2-10 teach:

storing a general construction database 2, the database comprising structured datasets of a plurality of object entities of physical construction components, in particular wherein the datasets comprise digital construction plan information (e.g. 2D or 3D information, or CAD information) and attribute information of the object entities, a hierarchical structure of the object entities representing a desired construction result to be established by the physical construction components,

Claim 1 further recites “marking out points in a construction site related to the plurality of elements in the building plan, wherein: marking out points is performed by a robotic device.” Pettersson column 12 lines 43-48 disclose:

The mobile support system might be a ground based robotic vehicle or a flying robotic vehicle (UAV) and the markings might be based on at least one of a 2D marking, e.g. a painted marking, a laser burned marking, or written information, a 3D structure, e.g. by a surface spattering device.

A robotic vehicle is robotic. Painting markings is marking out points.

Claim 1 further recites “and points are marked in the construction site within a pre-established accuracy using a robotic total station with the robotic device.” Pettersson column 13 lines 9-11 disclose “the mobile support system might also be adapted both for taking control measurements and for tagging markings at the same time, i.e. in one go.” Pettersson column 14 lines 25-26 disclose “regarding special requirements for the accuracy and small tolerance levels.” A tolerance level corresponds with a pre-established required accuracy. Pettersson column 15 lines 5-9 disclose “the mobile support system 7' comprises a temperature and humidity detector 19' and a generic laser based surveying device 20 for measuring spatial points on an arbitrarily shaped surface and for measuring edges and corners of adjacent surfaces.” A person of ordinary skill in the art would understand a laser-based surveying device corresponds with a total station. The mobile support system corresponds with the robotic device.

Claim 1 further recites “marks are made using a permanent marker or spray paint.” Pettersson column 12 lines 43-48 disclose:

The mobile support system might be a ground based robotic vehicle or a flying robotic vehicle (UAV) and the markings might be based on at least one of a 2D marking, e.g. a painted marking, a laser burned marking, or written information, a 3D structure, e.g. by a surface spattering device.

Painted markings are marks made using spray paint.

Claim 1 further recites “metadata of points are not marked using the robotic device.” Pettersson column 13 lines 51-57 disclose “Additional information about the construction and the work package is given as human readable instructions 10 or as code instructions, particularly by pictograms 11, e.g. indicating painting of the floor with a corresponding color, or by machine readable codes 12, e.g. representing guiding points or specific settings for an automated or semi-automated executing device.” The additional information in the form of human readable instructions or code instructions corresponds with metadata of various points which are not marked by the tags.

Claim 1 further recites “the robotic device operates in a global navigation satellite system (GNSS) restricted environment.” Pettersson column 13 lines 13-17 disclose “the mobile support system 5,5',7 is provided with specific location and orientation information stored on the construction database.
For example, this might include GPS coordinates, 3D coordinates of an internal construction site coordinate system, 3D building information.” GPS coordinates correspond with a GNSS environment.

Claim 1 further recites “accuracy of marking points in the construction site, compared to the building plan, is equal to or better than 5 centimeters.” Pettersson column 13 lines 13-17 disclose “GPS coordinates.” Pettersson and Singer do not explicitly disclose accuracy better than 5 centimeters; however, in the analogous art of augmented reality for construction site monitoring and documentation, Zollmann page 142 left column first paragraph teaches “As a GPS sensor, we use an L1/L2 real-time kinematic (RTK) receiver that measures the device’s position within centimeter accuracy.” Zollmann page 144 left column second paragraph teaches “we apply a registration method that is able to achieve registration accuracy in the centimeter and subangle range.” Centimeter accuracy is an accuracy of better than 5 centimeters. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine Pettersson, Singer, and Zollmann. One having ordinary skill in the art would have found motivation to incorporate GPS with centimeter-level accuracy into the system of construction site management with a marking robot because it “allows for the access of information right where it is needed.” See Zollmann abstract.

Claim 1 further recites “and a number of points marked is equal to or greater than 500.” Pettersson column 12 lines 43-48 disclose:

The mobile support system might be a ground based robotic vehicle or a flying robotic vehicle (UAV) and the markings might be based on at least one of a 2D marking, e.g. a painted marking, a laser burned marking, or written information, a 3D structure, e.g. by a surface spattering device.

Plural markings are at least two. Furthermore, if it is advantageous to use the mobile support system at least once for the markings, then it is obvious to use it again for the same reasons to achieve the same effect. There is no limitation regarding the time period over which the at least 500 points are marked, and it is obvious that over a long enough time period and over enough construction projects the number of marked points will exceed 500 points. This obviousness rationale is supported by MPEP §2144.04(VI)(B): “the court held that mere duplication of parts has no patentable significance unless a new and unexpected result is produced,” citing In re Harza, 274 F.2d 669, 124 USPQ 378 (CCPA 1960). Alternatively, in the analogous art of managing construction layout information, Degani page 1826 left column second paragraph teaches “Robotic marking systems are restricted to environments where the floors are clean and clear, for marking and travel, and where the quantity of layout work is large enough to justify their setup costs.” Examiner considers “large enough” to denote an amount which is at least similar to 500 or more. MPEP §2144.05(I) states “a prima facie case of obviousness exists where the claimed ranges or amounts do not overlap with the prior art but are merely close.” It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine Pettersson, Singer, Zollmann, and Degani. One having ordinary skill in the art would have found motivation to mark at least 500 points with the construction site marking robot for the advantageous purpose of having enough layout work “to justify their setup costs.” See Degani page 1826 left column second paragraph.
Claim 1 further recites “transmitting data from the building plan to an augmented-reality device, wherein data from the building plan includes metadata about the points.” Pettersson column 13 lines 51-57 disclose “Additional information about the construction and the work package is given as human readable instructions 10 or as code instructions, particularly by pictograms 11, e.g. indicating painting of the floor with a corresponding color, or by machine readable codes 12, e.g. representing guiding points or specific settings for an automated or semi-automated executing device.” The additional information in the form of human readable instructions or code instructions corresponds with metadata of various points. Pettersson does not explicitly disclose augmented reality; however, in the analogous art of building construction surveying information, Singer column 9 lines 20-21 teaches “the [Augmented-Reality (AR)]-data may be overlaid onto the real view of the scene which the user has when he is wearing the AR-device.” Singer column 8 lines 57-62 teach:

Accordingly, the AR-glasses may store, or receive the AR-data from a server, i.e. the augmented information displayed for assisting the user. These AR-data are spatially related to the reference system, in particular to the BIM model. The BIM-model may optionally also be stored on the AR-device or retrieved from the server.

The Building Information Model (BIM) corresponds with building plan data. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine Pettersson and Singer. One having ordinary skill in the art would have found motivation to incorporate augmented reality into the system of construction site management with a marking robot for the advantageous purposes of displaying “virtual objects with a precise spatial link to the coordinate system, i.e. to the natural environment” and “improving the positioning accuracy of displayed AR-data on the display of an AR-device.” See Singer column 2 lines 10-12 and 41-43.

Claim 1 further recites “aligning the internal map of the augmented-reality device to the construction site by comparing locations of points in the internal map to data from the building plan.” Singer column 9 lines 24-30 teach:

Alternatively to the act of referencing the AR-device relative to the reference system by means of the referencing marker, the process can also be understood in such a way that the AR-device does not really “lock” in into the reference system but merely determines a spatial relationship between the referencing marker and AR-data assigned to the referencing marker.

Determining a spatial relationship between a reference marker and AR data is a comparison between points in the internal map and data from the building plan. Singer column 10 lines 4-8 teach “the projector 20 to project the marker accordingly on a surface adjacent to where the AR-data are linked, in particular wherein the AR-data and the projection of the referencing marker have the same position within the reference system or BIM-model.” The AR data and reference system or BIM model having linked same positions corresponds with the AR data being aligned with the building plan information. The reference marker and/or AR data correspond with the internal map.
Claim 1 further recites “and presenting metadata about the points to a user of the augmented-reality device as the user looks at the points in the construction site.” Pettersson does not explicitly disclose augmented reality; however, in the analogous art of building construction surveying information, Singer column 9 lines 20-21 teaches “the [Augmented-Reality (AR)]-data may be overlaid onto the real view of the scene which the user has when he is wearing the AR-device.” The AR data being overlaid onto the real view of the scene is a presentation of the metadata to a user as the user looks at the construction site. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine Pettersson and Singer. One having ordinary skill in the art would have found motivation to incorporate augmented reality into the system of construction site management with a marking robot for the advantageous purposes of displaying “virtual objects with a precise spatial link to the coordinate system, i.e. to the natural environment” and “improving the positioning accuracy of displayed AR-data on the display of an AR-device.” See Singer column 2 lines 10-12 and 41-43.

Claim 2 recites “2. The method of claim 1, wherein the augmented-reality device is a first augmented-reality device, and the method further comprises transmitting data from the building plan to a second augmented-reality device.” Singer column 6 lines 50-51 disclose “Augmented Reality (AR)-device according to the invention, i.e. AR-glasses 10 and an AR-helmet 11.” Singer column 6 lines 59-67 teach:

The display shown in the two examples of FIGS. 1a and 1b may comprise a projector (not shown) for projecting the AR-data onto the display 120/121. Other embodiments of the AR-device according to the invention are handheld devices such as smart phones or tablet computers.
Such handhelds usually also comprise a visual sensor (camera), a computer (processor) and a display (screen) and are configured to display referenced AR-data.

Glasses, helmet, projector, smart phones, and tablet computers are each respective augmented-reality devices. Singer column 2 lines 1-3 teach “Augmented Reality (AR) systems are often utilised for supporting users on a site by referenced visualisations.” Plural users correspond with a recognition that there may be more than one user and accordingly a second or more device.

Claim 3 recites “3. The method of claim 2, wherein the first augmented-reality device receives different metadata than the second augmented-reality device.” Singer column 9 lines 17-23 teach:

Whatever data distribution option is chosen, once the AR-device 10 is provided with a selection of AR-data 1200, 1201 and with a selection of at least one referencing marker 200, the AR-data may be overlaid onto the real view of the scene which the user has when he is wearing the AR-device in case the referencing marker 200 is situated within the field of view of the visual sensor 100.

Selected AR-data corresponds with respective received metadata. The overlaying being contingent on the referencing marker being situated within a field of view corresponds with different metadata being displayed for different fields of view and different reference markers.

Claims 4-6, 9-14, 19, and 20

Claims 4-6, 9-14, 19, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over US 10,713,607 B2 Pettersson, et al. [herein “Pettersson”] in view of US patent 10,788,323 B2 Singer [herein “Singer”].

Claim 4 recites “4. A system for construction layout using robotic marking and augmented-reality metadata.” Pettersson column 12 lines 43-48 disclose:

The mobile support system might be a ground based robotic vehicle or a flying robotic vehicle (UAV) and the markings might be based on at least one of a 2D marking, e.g. a painted marking, a laser burned marking, or written information, a 3D structure, e.g. by a surface spattering device.

A robotic vehicle is robotic. Pettersson does not explicitly disclose augmented reality; however, in the analogous art of building construction surveying information, Singer column 9 lines 20-21 teaches “the [Augmented-Reality (AR)]-data may be overlaid onto the real view of the scene which the user has when he is wearing the AR-device.” It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine Pettersson and Singer. One having ordinary skill in the art would have found motivation to incorporate augmented reality into the system of construction site management with a marking robot for the advantageous purposes of displaying “virtual objects with a precise spatial link to the coordinate system, i.e. to the natural environment” and “improving the positioning accuracy of displayed AR-data on the display of an AR-device.” See Singer column 2 lines 10-12 and 41-43.

Claim 4 further recites “the system comprising: a robotic device comprising instructions to mark out points in a construction site within a pre-established accuracy related to a plurality of elements in a building plan.” Pettersson column 12 lines 43-48 disclose:

The mobile support system might be a ground based robotic vehicle or a flying robotic vehicle (UAV) and the markings might be based on at least one of a 2D marking, e.g. a painted marking, a laser burned marking, or written information, a 3D structure, e.g. by a surface spattering device.

A robotic vehicle is robotic. Painting markings is marking out points. Pettersson column 13 lines 9-11 disclose “the mobile support system might also be adapted both for taking control measurements and for tagging markings at the same time, i.e. in one go.” Pettersson column 14 lines 25-26 disclose “regarding special requirements for the accuracy and small tolerance levels.” A tolerance level corresponds with a pre-established required accuracy. Pettersson column 15 lines 5-9 disclose “the mobile support system 7' comprises a temperature and humidity detector 19' and a generic laser based surveying device 20 for measuring spatial points on an arbitrarily shaped surface and for measuring edges and corners of adjacent surfaces.” A person of ordinary skill in the art would understand a laser-based surveying device corresponds with a total station. The mobile support system corresponds with the robotic device.

Claim 4 further recites “and an augmented-reality device comprising instructions to: receive data of the building plan, wherein data of the building plan includes metadata about the points.” Pettersson column 12 lines 2-10 teach:

storing a general construction database 2, the database comprising structured datasets of a plurality of object entities of physical construction components, in particular wherein the datasets comprise digital construction plan information (e.g. 2D or 3D information, or CAD information) and attribute information of the object entities, a hierarchical structure of the object entities representing a desired construction result to be established by the physical construction components,

Claim 4 further recites “align an internal map of the augmented-reality device to the construction site.” Singer column 9 lines 24-30 teach:

Alternatively to the act of referencing the AR-device relative to the reference system by means of the referencing marker, the process can also be understood in such a way that the AR-device does not really “lock” in into the reference system but merely determines a spatial relationship between the referencing marker and AR-data assigned to the referencing marker.
Determining a spatial relationship between a reference marker and AR data is a comparison between points in the internal map and data from the building plan. Singer column 10 lines 4-8 teach “the projector 20 to project the marker accordingly on a surface adjacent to where the AR-data are linked, in particular wherein the AR-data and the projection of the referencing marker have the same position within the reference system or BIM-model.” The AR data and reference system or BIM model having linked same positions corresponds with the AR data being aligned with the building plan information. The reference marker and/or AR data correspond with the internal map.

Claim 4 further recites “and present metadata about the points to a user of the augmented-reality device as the user looks at points in the construction site.” Pettersson does not explicitly disclose augmented reality; however, in the analogous art of building construction surveying information, Singer column 9 lines 20-21 teaches “the [Augmented-Reality (AR)]-data may be overlaid onto the real view of the scene which the user has when he is wearing the AR-device.” The AR data being overlaid onto the real view of the scene is a presentation of the metadata to a user as the user looks at the construction site. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine Pettersson and Singer. One having ordinary skill in the art would have found motivation to incorporate augmented reality into the system of construction site management with a marking robot for the advantageous purposes of displaying “virtual objects with a precise spatial link to the coordinate system, i.e. to the natural environment” and “improving the positioning accuracy of displayed AR-data on the display of an AR-device.” See Singer column 2 lines 10-12 and 41-43.

Claim 5 recites “5. The system of claim 4, wherein aligning the internal map of the augmented-reality device to the construction site comprises comparing locations of points in the internal map to data from the building plan.” Singer column 9 lines 24-30 teach:

Alternatively to the act of referencing the AR-device relative to the reference system by means of the referencing marker, the process can also be understood in such a way that the AR-device does not really “lock” in into the reference system but merely determines a spatial relationship between the referencing marker and AR-data assigned to the referencing marker.

Determining a spatial relationship between a reference marker and AR data is a comparison between points in the internal map and data from the building plan. Singer column 10 lines 4-8 teach “the projector 20 to project the marker accordingly on a surface adjacent to where the AR-data are linked, in particular wherein the AR-data and the projection of the referencing marker have the same position within the reference system or BIM-model.” The AR data and reference system or BIM model having linked same positions corresponds with the AR data being aligned with the building plan information. The reference marker and/or AR data correspond with the internal map.

Claim 6 recites “6. The system of claim 4, wherein instructions of the robotic device do not have the robotic device mark metadata near points.” Pettersson column 13 lines 51-57 disclose “Additional information about the construction and the work package is given as human readable instructions 10 or as code instructions, particularly by pictograms 11, e.g. indicating painting of the floor with a corresponding color, or by machine readable codes 12, e.g.
representing guiding points or specific settings for an automated or semi-automated executing device.” The additional information in the form of human readable instructions or code instructions corresponds with metadata of various points which are not marked by the tags.

Claim 9 recites “9. The system of claim 4, wherein the robotic device is a projector.” Singer column 6 lines 59-62 teach “The display shown in the two examples of FIGS. 1a and 1b may comprise a projector (not shown) for projecting the AR-data onto the display 120/121.”

Claim 10 recites “10. The system of claim 9, wherein the augmented-reality device is a first augmented-reality device, and the system further comprises a second augmented-reality device configured to receive data of the building plan.” Singer column 6 lines 50-51 disclose “Augmented Reality (AR)-device according to the invention, i.e. AR-glasses 10 and an AR-helmet 11.” Singer column 6 lines 59-67 teach:

The display shown in the two examples of FIGS. 1a and 1b may comprise a projector (not shown) for projecting the AR-data onto the display 120/121. Other embodiments of the AR-device according to the invention are handheld devices such as smart phones or tablet computers. Such handhelds usually also comprise a visual sensor (camera), a computer (processor) and a display (screen) and are configured to display referenced AR-data.

Glasses, helmet, projector, smart phones, and tablet computers are each respective augmented-reality devices. Singer column 2 lines 1-3 teach “Augmented Reality (AR) systems are often utilised for supporting users on a site by referenced visualisations.” Plural users correspond with a recognition that there may be more than one user and accordingly a second or more device.

Claim 11 recites “11.
A method for construction layout using robotic marking and augmented-reality metadata.” Pettersson column 12 lines 43-48 disclose: The mobile support system might be a ground based robotic vehicle or a flying robotic vehicle (UAV) and the markings might be based on at least one of a 2D marking, e.g. a painted marking, a laser burned marking, or written information, a 3D structure, e.g. by a surface spattering device. A robotic vehicle is robotic. Pettersson does not explicitly disclose augmented reality; however, in analogous art of building construction surveying information, Singer column 9 lines 20-21 teaches “the [Augmented-Reality (AR)]-data may be overlaid onto the real view of the scene which the user has when he is wearing the AR-device.” It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine Pettersson and Singer. One having ordinary skill in the art would have found motivation to use augmented reality into the system of construction site management with marking robot for the advantageous purpose of “to display virtual objects with a precise spatial link to the coordinate system, i.e. to the natural environment” and “improving the positioning accuracy of displayed AR-data on the display of an AR-device.” See Singer column 2 lines 10-12 and 41-43. Claim 11 further recites “the method comprising: receiving data from a building plan, wherein the building plan includes a plurality of elements.” Pettersson column 12 lines 2-10 teach: storing a general construction database 2, the database comprising structured datasets of a plurality of object entities of physical construction components, in particular wherein the datasets comprise digital construction plan information (e.g. 
2D or 3D information, or CAD information) and attribute information of the object entities, a hierarchical structure of the object entities representing a desired construction result to be established by the physical construction components,

Claim 11 further recites “marking out points in a construction site related to the plurality of elements in the building plan, wherein: marking out points is performed by a robotic device.” Pettersson column 12 lines 43-48 discloses: The mobile support system might be a ground based robotic vehicle or a flying robotic vehicle (UAV) and the markings might be based on at least one of a 2D marking, e.g. a painted marking, a laser burned marking, or written information, a 3D structure, e.g. by a surface spattering device. A robotic vehicle is robotic. Painting markings is marking out points.

Claim 11 further recites “and points are marked in the construction site within a pre-established accuracy.” Pettersson column 13 lines 9-11 discloses “the mobile support system might also be adapted both for taking control measurements and for tagging markings at the same time, i.e. in one go.” Pettersson column 14 lines 25-26 discloses “regarding special requirements for the accuracy and small tolerance levels.” A tolerance level corresponds with a pre-established required accuracy.

Claim 11 further recites “transmitting data from the building plan to an augmented-reality device, wherein data from the building plan includes metadata about the points.” Pettersson column 13 lines 51-57 discloses “Additional information about the construction and the work package is given as human readable instructions 10 or as code instructions, particularly by pictograms 11, e.g. indicating painting of the floor with a corresponding color, or by machine readable codes 12, e.g.
representing guiding points or specific settings for an automated or semi-automated executing device.” The additional information in the form of human readable instructions or code instructions corresponds with metadata of various points. Pettersson does not explicitly disclose augmented reality; however, in analogous art of building construction surveying information, Singer column 9 lines 20-21 teaches “the [Augmented-Reality (AR)]-data may be overlaid onto the real view of the scene which the user has when he is wearing the AR-device.” Singer column 8 lines 57-62 teaches: Accordingly, the AR-glasses may store, or receive the AR-data from a server, i.e. the augmented information displayed for assisting the user. These AR-data are spatially related to the reference system, in particular to the BIM model. The BIM-model may optionally also be stored on the AR-device or retrieved from the server. The Building Information Model (BIM) corresponds with building plan data. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine Pettersson and Singer. One having ordinary skill in the art would have found motivation to incorporate augmented reality into the system of construction site management with a marking robot for the advantageous purposes of being able “to display virtual objects with a precise spatial link to the coordinate system, i.e. to the natural environment” and of “improving the positioning accuracy of displayed AR-data on the display of an AR-device.” See Singer column 2 lines 10-12 and 41-43.
Claim 11 further recites “aligning an internal map of the augmented-reality device to the construction site.” Singer column 9 lines 24-30 teaches: Alternatively to the act of referencing the AR-device relative to the reference system by means of the referencing marker, the process can also be understood in such a way that the AR-device does not really “lock” in into the reference system but merely determines a spatial relationship between the referencing marker and AR-data assigned to the referencing marker. Determining a spatial relationship between a reference marker and AR data is a comparison between points in the internal map and data from the building plan. Singer column 10 lines 4-8 teaches “the projector 20 to project the marker accordingly on a surface adjacent to where the AR-data are linked, in particular wherein the AR-data and the projection of the referencing marker have the same position within the reference system or BIM-model.” The AR data and the reference system or BIM model having linked same positions corresponds with the AR data being aligned with the building plan information. The reference marker and/or AR data correspond with the internal map.

Claim 11 further recites “and presenting metadata about the points to a user of the augmented-reality device as the user looks at the points in the construction site.” Pettersson does not explicitly disclose augmented reality; however, in analogous art of building construction surveying information, Singer column 9 lines 20-21 teaches “the [Augmented-Reality (AR)]-data may be overlaid onto the real view of the scene which the user has when he is wearing the AR-device.” The AR data being overlaid onto the real view of the scene is a presentation of the metadata to a user as the user looks at the construction site. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine Pettersson and Singer.
One having ordinary skill in the art would have found motivation to incorporate augmented reality into the system of construction site management with a marking robot for the advantageous purposes of being able “to display virtual objects with a precise spatial link to the coordinate system, i.e. to the natural environment” and of “improving the positioning accuracy of displayed AR-data on the display of an AR-device.” See Singer column 2 lines 10-12 and 41-43.

Claim 12 further recites “12. The method of claim 11, wherein aligning the internal map of the augmented-reality device to the construction site comprises comparing locations of points in the internal map to data from the building plan.” Singer column 9 lines 24-30 teaches: Alternatively to the act of referencing the AR-device relative to the reference system by means of the referencing marker, the process can also be understood in such a way that the AR-device does not really “lock” in into the reference system but merely determines a spatial relationship between the referencing marker and AR-data assigned to the referencing marker. Determining a spatial relationship between a reference marker and AR data is a comparison between points in the internal map and data from the building plan. Singer column 10 lines 4-8 teaches “the projector 20 to project the marker accordingly on a surface adjacent to where the AR-data are linked, in particular wherein the AR-data and the projection of the referencing marker have the same position within the reference system or BIM-model.” The AR data and the reference system or BIM model having linked same positions corresponds with the AR data being aligned with the building plan information. The reference marker and/or AR data correspond with the internal map.

Claim 13 further recites “13.
The method of claim 11, wherein points are marked without marking metadata near points.” Pettersson column 13 lines 51-57 discloses “Additional information about the construction and the work package is given as human readable instructions 10 or as code instructions, particularly by pictograms 11, e.g. indicating painting of the floor with a corresponding color, or by machine readable codes 12, e.g. representing guiding points or specific settings for an automated or semi-automated executing device.” The additional information in the form of human readable instructions or code instructions corresponds with metadata of various points which are not marked by the tags.

Claim 14 further recites “14. The method of claim 11, wherein transmitting data from the building plan to the augmented-reality device is performed using one or more wireless connections.” Singer column 7 lines 4-7 teaches “the AR-device may comprise a wireless communication unit (using e.g. WiFi, Bluetooth, radio link, etc.) for at least one of: connecting to, communicating with, and transferring data from/to a server.” Dependent claims 19 and 20 are substantially similar to claims 2 and 3 above and are rejected for the same reasons.

Dependent Claims 7, 15, and 18

Claims 7 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Pettersson and Singer as applied to claims 4 and 11 above, and alternatively further in view of Degani, A., et al., “An Automated System for Projection of Interior Construction Layouts,” IEEE Transactions on Automation Science & Engineering, vol. 16, no. 4 (2019) [herein “Degani”].

Claim 7 further recites “7. The system of claim 4, wherein a number of points marked is equal to or greater than 200.” Pettersson column 12 lines 43-48 discloses: The mobile support system might be a ground based robotic vehicle or a flying robotic vehicle (UAV) and the markings might be based on at least one of a 2D marking, e.g.
a painted marking, a laser burned marking, or written information, a 3D structure, e.g. by a surface spattering device. Plural markings are at least two. Furthermore, if it is advantageous to use the mobile support system at least once for the markings, then it is obvious to use it again for the same reasons to achieve the same effect. There is no limitation regarding the time period over which the at least 200 points are marked, and it is obvious that over a long enough time period and over enough construction projects the number of marked points will exceed 200. This obviousness rationale is supported by MPEP §2144.04(VI)(B): “the court held that mere duplication of parts has no patentable significance unless a new and unexpected result is produced,” citing In re Harza, 274 F.2d 669, 124 USPQ 378 (CCPA 1960). Alternatively, in analogous art of managing construction layout information, Degani page 1826 left column second paragraph teaches “Robotic marking systems are restricted to environments where the floors are clean and clear, for marking and travel, and where the quantity of layout work is large enough to justify their setup costs.” Examiner considers “large enough” to denote a quantity at least comparable to 200 or more points. MPEP §2144.05(I) states “a prima facie case of obviousness exists where the claimed ranges or amounts do not overlap with the prior art but are merely close.” It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine Pettersson, Singer, and Degani. One having ordinary skill in the art would have found motivation to mark at least 200 points in the system of construction site management with a marking robot for the advantageous purpose of being able “to justify their setup costs.” See Degani page 1826 left column second paragraph. Dependent claim 15 is substantially similar to claim 7 above and is rejected for the same reasons.

Claim 18 further recites “18.
The method of claim 11, wherein at least one of the points is marked on a ceiling.” Pettersson does not explicitly teach marking points on a ceiling; however, in analogous art of managing construction layout information, Degani abstract teaches: The system projects any desired information—drawings and images—onto the work surface (floor, walls, or ceiling) in the correct location, scale, and orientation. The prototype apparatus consists of a laser range scanner, a projector, and a camera. Projection of the work instructions directly onto the work surface is accurate and immediate. Projecting information onto a ceiling work surface is marking points on a ceiling. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine Pettersson, Singer, and Degani. One having ordinary skill in the art would have found motivation to incorporate projection onto a ceiling work surface into the system of construction site management with a marking robot because “[p]rojection of the work instructions directly onto the work surface is accurate and immediate.” See Degani abstract.

Dependent Claims 8 and 16

Claims 8 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Pettersson and Singer as applied to claims 4 and 11 above, and further in view of Zollmann, S., et al., “Augmented Reality for Construction Site Monitoring and Documentation,” Proceedings of the IEEE, vol. 102, no. 2 (2014) [herein “Zollmann”].

Claim 8 further recites “8.
The system of claim 4, wherein the pre-established accuracy is equal to or less than 5 centimeters.” Pettersson column 13 lines 13-17 discloses “GPS coordinates.” Pettersson and Singer do not explicitly disclose accuracy better than 5 centimeters; however, in analogous art of augmented reality for construction site monitoring and documentation, Zollmann page 142 left column first paragraph teaches “As a GPS sensor, we use an L1/L2 real-time kinematic (RTK) receiver that measures the device’s position within centimeter accuracy.” Zollmann page 144 left column second paragraph teaches “we apply a registration method that is able to achieve registration accuracy in the centimeter and subangle range.” Centimeter accuracy is an accuracy of better than 5 centimeters. It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine Pettersson, Singer, and Zollmann. One having ordinary skill in the art would have found motivation to use GPS with centimeter-level accuracy in the system of construction site management with a marking robot because it “allows for the access of information right where it is needed.” See Zollmann abstract. Dependent claim 16 is substantially similar to claim 8 above and is rejected for the same reasons.

Dependent Claim 17

Claim 17 is rejected under 35 U.S.C. 103 as being unpatentable over Pettersson and Singer as applied to claim 11 above, and further in view of US Patent 12,222,723 B2 to Yamauchi [herein “Yamauchi”]. Claim 17 further recites “17. The method of claim 11, wherein the robotic device is a quadruped.” Pettersson does not explicitly disclose a quadruped robot; however, in analogous art of construction site robots, Yamauchi column 6 line 67 teaches “FIG.
1A depicts a quadruped robot with four legs.” Yamauchi column 5 lines 62-66 teaches “For instance, if the environment for the robot is a construction site, a construction site is a changing environment often including temporary or non-permanent objects, such as tools, tool storage, machinery, material, etc.” Yamauchi column 7 lines 4-6 teaches “that provide a means to traverse the terrain within the environment 10.” It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine Pettersson, Singer, and Yamauchi. One having ordinary skill in the art would have found motivation to incorporate a quadruped robot with four legs into the system of construction site management with a marking robot because quadruped robots have art-recognized suitability for the intended purpose of navigating a construction site. See MPEP §2144.07 and Yamauchi column 5 lines 62-66 and column 7 lines 4-6.

Examiner Comment

Claim 9 recites “wherein the robotic device is a projector.” While not a typographic error per se, Examiner observes Applicant may have intended to instead recite “wherein the robotic device [[is]] further comprises a projector.”

Conclusion

Prior art made of record and not relied upon is considered pertinent to applicant's disclosure.

US 20230099779 A1, REDGEWELL, Duncan, et al., Metrology System
US 20130310971 A1, Prouty, Joseph M., Robotic Construction Site Marking Apparatus
US 20230237643 A1, SALMON, Richard, et al., Augmented Reality System with Interactive Overlay Drawing
US 10185787 B1, Côté, Stéphane, et al., Tool for accurate onsite model visualization that facilitates environment interaction
YouTube, “Project Lion - A DPR/Trimble Automated Layout Robot” (April 2013), available at <https://www.youtube.com/watch?v=k-lfL3dA1SE>, showing the development of “an automate[d] layout robot” used with robotic total stations.
Xu, J. & Moreu, F.
“A Review of Augmented Reality Applications in Civil Infrastructure During the 4th Industrial Revolution,” Frontiers in Built Environment, vol. 7, article 640732 (June 2021). Page 3 right column third paragraph teaches “Applying AR in the lifecycle construction site planning can help keep the project within budget and to avoid process mistakes.”

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Jay B Hann, whose telephone number is (571) 272-3330. The examiner can normally be reached M-F 10am-7pm EDT. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Renee Chavez, can be reached at (571) 270-1104. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Jay Hann/
Primary Examiner, Art Unit 2186
4 January 2026

Prosecution Timeline

Nov 04, 2022
Application Filed
Jan 04, 2026
Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12580384
AUTOMATION TOOL TO CREATE CHRONOLOGICAL AC POWER FLOW CASES FOR LARGE INTERCONNECTED SYSTEMS
2y 5m to grant Granted Mar 17, 2026
Patent 12573182
COMPUTER VISION AND SPEECH ALGORITHM DESIGN SERVICE
2y 5m to grant Granted Mar 10, 2026
Patent 12560740
METHOD FOR MODELLING THE FORMATION OF A SEDIMENTARY BASIN USING A STRATIGRAPHIC FORWARD MODELING PROGRAM
2y 5m to grant Granted Feb 24, 2026
Patent 12560741
System and Method to Develop Naturally Fractured Hydrocarbon Reservoirs Using A Fracture Density Index
2y 5m to grant Granted Feb 24, 2026
Patent 12560067
METHOD FOR HYDRAULIC FRACTURING AND MITIGATING PROPPANT FLOWBACK
2y 5m to grant Granted Feb 24, 2026
Based on this examiner's 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
61%
Grant Probability
95%
With Interview (+34.1%)
3y 5m
Median Time to Grant
Low
PTA Risk
Based on 463 resolved cases by this examiner. Grant probability derived from career allow rate.
