DETAILED ACTION
Response to Amendments/Arguments
Applicant’s response with respect to the double patenting rejection (filing a Terminal Disclaimer) overcomes the double patenting rejection. The double patenting rejection is withdrawn.
Applicant’s response with respect to the objection to claim 1 has been fully considered and is persuasive. The objection is withdrawn.
Applicant’s response with respect to the rejections of claims 1-20 under 35 U.S.C. 103 has been fully considered but is not persuasive. Specifically, the applicant has amended the claims, and merely asserted that "the cited references, alone or in combination, fail to disclose at least the elements of 'identifying multiple patterns of the objects detected using the sensor data and the location data; analyzing the multiple patterns of the objects detected, wherein the multiple patterns includes a first pattern and a second pattern, and wherein the second pattern matches the first pattern; deriving information about the objects behind the opaque surface using the multiple patterns'" as now recited.
However, the applicant has not elaborated on any part of this amendment, and it is not clear why the applied references do not read on the amended claims. Upon consideration, it appears that the references (Watts, in particular) read on each of the amended limitations, as explained below.
As for the arguments against claims 5-7 and 15-17, it is important to note col. 3, l. 50 to col. 4, l. 34 of the Watts reference. Specifically, Watts is understood here to teach that it is possible to identify specific types of building features within a wall based on the types of sensors that returned signals, and based on physical characteristics obtained from sensor data. The example given is that studs can be identified by their material, their width within a wall, and by the fact that studs are known to be spaced apart at prescribed intervals. ("In other words, if an object is detected that is about 1.5 inches wide, controller 82 may interpret such object to be a stud. Similarly, controller 82 notes that two objects are about 16, 24 or 36 inches apart and made of wood, as wood studs are normally provided at such distances, controller 82 can increase the confidence level for both objects."). Watts states more broadly that "Persons skilled in the art will recognize that controller 82 can also identify the different objects by their width or distance between items." The table in cols. 3-4 of Watts lists wood studs, metal studs, ferrous metal pipe, non-ferrous metal pipe, electric wire, and PVC pipe (empty and full) as detectable types of objects; in this light, Watts' identification technique is clearly applicable to at least these object types, and it requires no intuitive leap to recognize that other objects may similarly be detected based on how they are known to exist within walls.
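Purely as a non-limiting illustration of the width-based identification Watts is read to describe (the function name and the 0.25 in. tolerance are hypothetical and appear in no applied reference), such a classification could be sketched as:

```python
# Hypothetical sketch: classify a detected in-wall object by its
# measured width, interpreting a ~1.5 in. wide object as a stud,
# as in Watts' example. Tolerance value is an assumption.
def classify_by_width(width_in, tol=0.25):
    """Return "stud" for widths within `tol` inches of 1.5 in.,
    otherwise "unknown"."""
    if abs(width_in - 1.5) <= tol:
        return "stud"
    return "unknown"

print(classify_by_width(1.5))   # stud
print(classify_by_width(0.75))  # unknown
```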
It is noted that these rejections are based on combinations of references, and Smoot teaches that it is advantageous to be able to perform a forensic determination of whether a building code has been met using information on multiple objects behind an opaque surface (col. 1, ll. 44-67). It would have been obvious before the effective filing date of the claimed invention to modify the teachings of Watts and Rhead such that they similarly involve performing a determination of whether a building code has been met using the information among multiple objects behind the opaque surface. One of ordinary skill in the art would be motivated to do so in order to advantageously non-destructively verify construction techniques (see col. 1, ll. 60-67 of Smoot).
Two such building codes are Chapter 23 of the 2018 International Building Code (IBC) (August 2017) by the International Code Council, and specification GA-216-2010 on the Application and Finishing of Gypsum Panel Products (2010) by the Gypsum Association.
With respect to the former of these, section 2306.3 ("Wood-frame shear walls") in chapter 23 of the 2018 International Building Code teaches spacings for fasteners applied to plywood panels in wood-frame shear walls. In particular, see table 2306.3(1). The examiner maintains that it would be obvious to use knowledge of how fasteners are to be applied, and how such boards are to be installed, to analyze said boards or particular features thereof from sensor data, and specifically to use this knowledge to determine whether the building code has been met based on the analysis as recited in amended claims 5 and 15, and to determine whether studs are placed according to the building code as recited in amended claims 6 and 16.
Similar reasoning applies to claims 7 and 17. Here, specification GA-216-2010 on the Application and Finishing of Gypsum Panel Products (2010) by the Gypsum Association is relied upon. This document states, in 4.6.4, that "All ends and edges of gypsum panel products, except those described in Sections 4.6.4.1, 4.6.4.2, and 4.6.4.3, shall be located over framing members or other solid backing," and states, in 4.8.2, that "Fasteners at gypsum panel product edges or ends shall be located not less than 3/8 in. (10 mm) from the edge or end." Also see the nailing patterns in fig. 6 & 7. The examiner maintains that it would be obvious to use knowledge of how such boards are to be installed to analyze said boards or particular features thereof from sensor data, and specifically to use sensor data to analyze the information of fasteners within close proximity from each other and identify an intersection of two adjoining drywall sheets behind the opaque surface as recited in amended claims 7 and 17.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-3, 8, 11-13, and 18 are rejected under 35 U.S.C. 103 as being unpatentable over US 9,194,950 to Watts et al. (hereinafter referred to as Watts) in view of US 10,209,357 to Rhead et al. (hereinafter referred to as Rhead).
With regards to claim 1, Watts teaches a method for identifying objects behind an opaque surface (col. 1, ll. 26-29), comprising:
generating location data by a location tracker (encoder 78 and/or 80), wherein the location data includes data relative to a point of reference that is linked to the opaque surface (encoder 78 and/or 80 provide location data (distance) from a starting point on a surface being scanned; col. 2, ll. 23-43 & col. 4, l. 66 to col. 5, l. 6);
collecting sensor data, by a sensor device (device 10 comprising sensors 84, 85, 86, 87, 88), of the objects behind the opaque surface corresponding to the location data in parallel, wherein the sensor device comprises one or more sensors and is tracked by the location tracker (sensors 84, 85, 86, 87, 88 collect sensor data during movement across the surface tracked by encoder 78 and/or 80 as per at least col. 4, ll. 48-54, col. 4, l. 66 to col. 5, l. 6, col. 5, ll. 17-20, etc.);
storing, in a memory (memory 83), the sensor data and the location data (see at least "As locating device 10 is moved from a starting point, controller 82 stores data related to the detected objects, the distance of the objects relative to the starting point…" in col. 4, l. 66 to col. 5, l. 6; also see claims 9 & 10);
identifying, by one or more processors (controller 82), multiple patterns (locations of all of certain types of objects in a scan path) of the objects detected using the sensor data and the location data (controller 82 operates sensors 84-88 across an opaque wall surface, and using the sensor data and location data, identifies the location and types of multiple different objects detected behind the wall as per at least col. 3, l. 50 to col. 4, l. 34 and col. 5, l. 53 to col. 6, l. 21);
analyzing, by the one or more processors, the multiple patterns of the objects detected, wherein the multiple patterns includes a first pattern (locations of a first type of object) and a second pattern (locations of a second type of object), and wherein the second pattern matches (overlaps, as best understood) the first pattern (controller 82 can detect multiple overlapping objects at the same location, such as metal nails or screws in wood studs, see the example of col. 5, ll. 27-36; controller 82 also analyzes detected information to establish a confidence level as to the identity and locations of objects, see at least col. 5, ll. 26-52; here, detecting a small metal object at the same location as a wood object could increase the confidence that the small object is a nail or screw, as these would be unlikely to exist on their own in a wall cavity);
deriving information, by the one or more processors, about the objects behind the opaque surface using the multiple patterns (controller 82 analyzes sensor and location data (the patterns) to derive information to display about the objects behind the wall as per at least col. 5, l. 27 to col. 6, l. 21); and
communicating, via a user interface (display 12 or other haptic or audio means), the information about the objects behind the opaque surface to a user (col. 5, l. 53 to col. 6, l. 32).
Watts does not expressly teach the location data including pairs of horizontal and vertical location data.
Rhead teaches the feature of tracking a sensor device in a manner so as to provide pairs of horizontal and vertical location data as the sensor device is moved across a surface (a scanning device comprises a position sensor configured to receive a plurality of signals respectively emitted by a plurality of position markers at known positions on a surface, and determines, based on the plurality of angles and the known positions, the position of the scanning device relative to the surface; see the abstract and claim 1). It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the method of Watts such that it generates location data as in Rhead and likewise provides pairs of horizontal and vertical location data as the sensor device is moved across a surface. Doing so would provide the predictable benefit of allowing the sensor device to be moved freely across the surface while still generating location data, enabling more freedom of movement during scanning and enabling desired areas to be scanned quickly.
With regards to claim 2, the combination of Watts and Rhead teaches the method of claim 1. In this combination, the generating location data further comprises tracking movements of the sensor device in both horizontal direction and vertical direction (taught by Rhead, e.g., as per fig. 10, 16, etc.); and scanning an area of the opaque surface one time (this is the minimum amount of scanning required to characterize an area of the surface and as such is understood to always be performed when the scanner is used; also, in the example of fig. 10 of Rhead, the surface is scanned in a zig-zag pattern one time), or scanning the area of the opaque surface a predetermined number of times.
With regards to claim 3, the combination of Watts and Rhead teaches the method of claim 1. Watts further teaches the sensor data comprising at least one of: sensor data collected by one or more capacitive sensors (capacitance sensor 84; col. 3, l. 50 to col. 4, l. 26 & col. 4, ll. 48-54); sensor data collected by one or more metal sensors; sensor data collected by one or more current sensors, or a combination thereof.
With regards to claim 8, the combination of Watts and Rhead teaches the method of claim 1. Watts further teaches the feature of communicating the information about the objects behind the opaque surface comprising: retrieving the information about the objects behind the opaque surface from the memory at a later time (stored data is retrieved to generate a level of confidence in the sensing that is presented to the user; col. 5, ll. 17-26).
With regards to claim 11, Watts teaches an apparatus for identifying objects behind an opaque surface (col. 1, ll. 26-29), comprising:
a location tracker (encoder 78 and/or 80) configured to generate location data, wherein the location data includes data relative to a point of reference that is linked to the opaque surface (encoder 78 and/or 80 provide location data (distance) from a starting point on a surface being scanned; col. 2, ll. 23-43 & col. 4, l. 66 to col. 5, l. 6);
a sensor device (device 10 comprising sensors 84, 85, 86, 87, 88) configured to collect sensor data of the objects behind the opaque surface, wherein the sensor data corresponds to the location data, and wherein the sensor device comprises one or more sensors and is tracked by the location tracker (sensors 84, 85, 86, 87, 88 collect sensor data during movement across the surface tracked by encoder 78 and/or 80 as per at least col. 4, ll. 48-54, col. 4, l. 66 to col. 5, l. 6, col. 5, ll. 17-20, etc.);
a memory (memory 83) configured to store the sensor data and the location data (see at least "As locating device 10 is moved from a starting point, controller 82 stores data related to the detected objects, the distance of the objects relative to the starting point…" in col. 4, l. 66 to col. 5, l. 6; also see claims 9 & 10);
one or more processors (controller 82) configured to identify multiple patterns (locations of all of certain types of objects in a scan path) of the objects detected using the sensor data and the location data (controller 82 operates sensors 84-88 across an opaque wall surface, and using the sensor data and location data, identifies the location and types of multiple different objects detected behind the wall as per at least col. 3, l. 50 to col. 4, l. 34 and col. 5, l. 53 to col. 6, l. 21), analyze the multiple patterns of the objects detected, wherein the multiple patterns includes a first pattern (locations of a first type of object) and a second pattern (locations of a second type of object), and wherein the second pattern matches (overlaps, as best understood) the first pattern (controller 82 can detect multiple overlapping objects at the same location, such as metal nails or screws in wood studs, see the example of col. 5, ll. 27-36; controller 82 also analyzes detected information to establish a confidence level as to the identity and locations of objects, see at least col. 5, ll. 26-52; here, detecting a small metal object at the same location as a wood object could increase the confidence that the small object is a nail or screw, as these would be unlikely to exist on their own in a wall cavity), and derive information about the objects behind the opaque surface using the multiple patterns (controller 82 analyzes sensor and location data (the patterns) to derive information to display about the objects behind the wall as per at least col. 5, l. 27 to col. 6, l. 21); and
a user interface (display 12 or other haptic or audio means) configured to communicate the information about the objects behind the opaque surface to a user (col. 5, l. 53 to col. 6, l. 32).
Watts does not expressly teach the location data including pairs of horizontal and vertical location data.
Rhead teaches the feature of tracking a sensor device in a manner so as to provide pairs of horizontal and vertical location data as the sensor device is moved across a surface (a scanning device comprises a position sensor configured to receive a plurality of signals respectively emitted by a plurality of position markers at known positions on a surface, and determines, based on the plurality of angles and the known positions, the position of the scanning device relative to the surface; see the abstract and claim 1). It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the apparatus of Watts such that it generates location data as in Rhead and likewise provides pairs of horizontal and vertical location data as the sensor device is moved across a surface. Doing so would provide the predictable benefit of allowing the sensor device to be moved freely across the surface while still generating location data, enabling more freedom of movement during scanning and enabling desired areas to be scanned quickly.
With regards to claim 12, the combination of Watts and Rhead teaches the apparatus of claim 11. In this combination, the location tracker is further configured to: track movements of the sensor device in both horizontal direction and vertical direction (taught by Rhead, e.g., as per fig. 10, 16, etc.); and scan an area of the opaque surface one time (this is the minimum amount of scanning required to characterize an area of the surface and as such is understood to always be performed when the scanner is used; also, in the example of fig. 10 of Rhead, the surface is scanned in a zig-zag pattern one time), or scan the area of the opaque surface a predetermined number of times.
With regards to claim 13, the combination of Watts and Rhead teaches the apparatus of claim 11. Watts further teaches the sensor data comprising at least one of: sensor data collected by one or more capacitive sensors (capacitance sensor 84; col. 3, l. 50 to col. 4, l. 26 & col. 4, ll. 48-54); sensor data collected by one or more metal sensors; sensor data collected by one or more current sensors, or a combination thereof.
With regards to claim 18, the combination of Watts and Rhead teaches the apparatus of claim 11. Watts further teaches the feature of the one or more processors being further configured to: retrieve the information about the objects behind the opaque surface from the memory at a later time (stored data is retrieved to generate a level of confidence in the sensing that is presented to the user; col. 5, ll. 17-26).
Claims 4, 14, 21, and 22 are rejected under 35 U.S.C. 103 as being unpatentable over Watts and Rhead as applied to claims 1 and 11 above, and further in view of US 9,903,975 to Smoot.
With regards to claim 4, the combination of Watts and Rhead teaches the method of claim 1. However, this combination does not expressly teach the method further comprising: performing forensic determination of whether a building code has been met using the information among multiple objects behind the opaque surface.
Smoot teaches that it is advantageous to be able to perform a forensic determination of whether a building code has been met using information on multiple objects behind an opaque surface (col. 1, ll. 44-67). It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the method of Watts and Rhead such that the method similarly involves performing forensic determination of whether a building code has been met using the information among multiple objects behind the opaque surface. One of ordinary skill in the art would be motivated to do so in order to advantageously non-destructively verify construction techniques (see col. 1, ll. 60-67 of Smoot).
With regards to claim 14, the combination of Watts and Rhead teaches the apparatus of claim 11. However, this combination does not expressly teach the one or more processors being further configured to: perform forensic determination of whether a building code has been met using the information among multiple objects behind the opaque surface.
Smoot teaches that it is advantageous to be able to perform a forensic determination of whether a building code has been met using information on multiple objects behind an opaque surface (col. 1, ll. 44-67). It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the apparatus of Watts and Rhead such that it similarly involves the one or more processors performing forensic determination of whether a building code has been met using the information among multiple objects behind the opaque surface. One of ordinary skill in the art would be motivated to do so in order to advantageously non-destructively verify construction techniques (see col. 1, ll. 60-67 of Smoot).
With regards to claim 21, the combination of Watts and Rhead teaches the method of claim 1. However, this combination does not expressly teach the method further comprising: planning a future project using the information among multiple objects behind the opaque surface.
Smoot teaches the feature of planning a future project using information on objects behind an opaque surface (col. 1, ll. 20-43). It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the method of Watts and Rhead such that it comprises planning a future project using the information among multiple objects behind the opaque surface (e.g., determine where to mount a heavy object or plan a wire or pipe run based thereon) as in Smoot. One of ordinary skill in the art would be motivated to do so in order to minimize the amount of future work to be performed or make said work more efficient (see col. 1, ll. 40-43 of Smoot).
With regards to claim 22, the combination of Watts and Rhead teaches the apparatus of claim 11. However, this combination does not expressly teach the one or more processors being further configured to: plan a future project using the information among multiple objects behind the opaque surface.
Smoot teaches the feature of planning a future project using information on objects behind an opaque surface (col. 1, ll. 20-43). It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the apparatus of Watts and Rhead such that the one or more processors are further configured to: plan a future project using the information among multiple objects behind the opaque surface (e.g., determine where to mount a heavy object or plan a wire or pipe run based thereon) as in Smoot. One of ordinary skill in the art would be motivated to do so in order to minimize the amount of future work to be performed or make said work more efficient (see col. 1, ll. 40-43 of Smoot).
Claims 5, 6, 15, and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Watts, Rhead, and Smoot as applied to claims 4 and 14 above, and further in view of Chapter 23 of the 2018 International Building Code (IBC) (August 2017) by the International Code Council.
With regards to claim 5, the combination of Watts, Rhead, and Smoot teaches the method of claim 4. In this combination, performing forensic determination would use the sensor data and the location data as a matter of course as all information analyzed and presented by Watts is based thereon. This combination does not expressly teach that the performing forensic determination comprises: using the sensor data and the location data to analyze nailing patterns of plywood sheets in a structural shear wall behind the opaque surface; and determining whether the building code has been met based on the analysis.
Section 2306.3 ("Wood-frame shear walls") in chapter 23 of the 2018 International Building Code teaches spacings for fasteners applied to plywood panels in wood-frame shear walls. In particular, see table 2306.3(1). Given that, as noted above, Smoot teaches that it is advantageous to be able to non-destructively perform a forensic determination of whether a building code has been met using information on objects behind an opaque surface, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the method taught by Watts, Rhead, and Smoot such that the performing forensic determination comprises: using the sensor data and the location data to analyze nailing patterns of plywood sheets in a structural shear wall behind the opaque surface (determine a fastening pattern based on sensor data); and determining whether the building code has been met based on the analysis (i.e., determine whether the detected fastening pattern satisfies section 2306.3 of the 2018 IBC). One of ordinary skill in the art would be motivated to do so in order to be able to evaluate appropriate plywood sheets in a structural shear wall without damaging the opaque surface.
With regards to claim 6, the combination of Watts, Rhead, and Smoot teaches the method of claim 4. In this combination, performing forensic determination would use the sensor data and the location data as a matter of course as all information analyzed and presented by Watts is based thereon. This combination does not expressly teach that the performing forensic determination comprises: using the sensor data and the location data to analyze the information of distances between studs; and determining whether the studs were placed according to the building code.
Section 2308.5.1 ("Stud size, height and spacing") in chapter 23 of the 2018 International Building Code teaches spacings for wood studs within walls. In particular, see table 2308.5.1. Given that, as noted above, Smoot teaches that it is advantageous to be able to non-destructively perform a forensic determination of whether a building code has been met using information on objects behind an opaque surface, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the method taught by Watts, Rhead, and Smoot such that the performing forensic determination comprises: using the sensor data and the location data to analyze the information of distances between studs (determine a stud spacing based on sensor data); and determining whether the studs were placed according to the building code (i.e., determine whether the detected spacing satisfies section 2308.5.1 of the 2018 IBC). One of ordinary skill in the art would be motivated to do so in order to be able to evaluate hidden stud spacings without damaging the opaque surface.
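Purely as a non-limiting illustration of the spacing check described above (the function, the allowed 16/24 in. on-center values, and the 0.5 in. tolerance are hypothetical parameters, not taken from any applied reference or from the IBC tables themselves), such a determination could be sketched as:

```python
# Hypothetical sketch: check detected stud positions (inches along a
# wall) against prescribed on-center spacings. Allowed spacings and
# tolerance are assumptions for illustration only.
def spacing_compliant(stud_positions_in, allowed_spacings=(16.0, 24.0), tol=0.5):
    """Return True if every gap between adjacent detected studs is
    within `tol` inches of one of the allowed on-center spacings."""
    positions = sorted(stud_positions_in)
    gaps = [b - a for a, b in zip(positions, positions[1:])]
    return all(
        any(abs(gap - s) <= tol for s in allowed_spacings)
        for gap in gaps
    )

print(spacing_compliant([0.0, 16.1, 32.0, 47.9]))  # True
print(spacing_compliant([0.0, 16.0, 35.0]))        # False (19 in. gap)
```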
With regards to claim 15, the combination of Watts, Rhead, and Smoot teaches the apparatus of claim 14. In this combination, analyzing would use the sensor data and the location data as a matter of course as all information analyzed and presented by Watts is based thereon. However, this combination does not expressly teach the one or more processors being further configured to: use the sensor data and the location data to analyze nailing patterns of plywood sheets in a structural shear wall behind the opaque surface; and determine whether the building code has been met based on the analysis.
Section 2306.3 ("Wood-frame shear walls") in chapter 23 of the 2018 International Building Code teaches spacings for fasteners applied to plywood panels in wood-frame shear walls. In particular, see table 2306.3(1). Given that, as noted above, Smoot teaches that it is advantageous to be able to non-destructively perform a forensic determination of whether a building code has been met using information on objects behind an opaque surface, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the apparatus taught by Watts, Rhead, and Smoot such that the one or more processors are further configured to: use the sensor data and the location data to analyze nailing patterns of plywood sheets in a structural shear wall behind the opaque surface (determine a fastening pattern based on sensor data); and determine whether the building code has been met based on the analysis (i.e., determine whether the detected fastening pattern satisfies section 2306.3 of the 2018 IBC). One of ordinary skill in the art would be motivated to do so in order to be able to evaluate appropriate plywood sheets in a structural shear wall without damaging the opaque surface.
With regards to claim 16, the combination of Watts, Rhead, and Smoot teaches the apparatus of claim 14. In this combination, analyzing would use the sensor data and the location data as a matter of course as all information analyzed and presented by Watts is based thereon. However, this combination does not expressly teach the one or more processors being further configured to: use the sensor data and the location data to analyze the information of distances between studs; and determine whether the studs were placed according to the building code.
Section 2308.5.1 ("Stud size, height and spacing") in chapter 23 of the 2018 International Building Code teaches spacings for wood studs within walls. In particular, see table 2308.5.1. Given that, as noted above, Smoot teaches that it is advantageous to be able to non-destructively perform a forensic determination of whether a building code has been met using information on objects behind an opaque surface, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the apparatus taught by Watts, Rhead, and Smoot such that the one or more processors are further configured to: use the sensor data and the location data to analyze the information of distances between studs (determine a stud spacing based on sensor data); and determine whether the studs were placed according to the building code (i.e., determine whether the detected spacing satisfies section 2308.5.1 of the 2018 IBC). One of ordinary skill in the art would be motivated to do so in order to be able to evaluate hidden stud spacings without damaging the opaque surface.
Claims 7 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Watts and Rhead as applied to claims 1 and 11 above, and further in view of Smoot and specification GA-216-2010 on the Application and Finishing of Gypsum Panel Products (2010) by the Gypsum Association.
With regards to claim 7, the combination of Watts and Rhead teaches the method of claim 1. In this combination, analyzing would use the sensor data and the location data as a matter of course as all information analyzed and presented by Watts is based thereon. This combination does not expressly teach that the deriving information about the objects behind the opaque surface comprises: using the sensor data and the location data to analyze the information of fasteners within close proximity from each other; and identifying an intersection of two adjoining drywall sheets behind the opaque surface.
Smoot teaches that it is advantageous to be able to perform a determination of whether a building code has been met using information on multiple objects behind an opaque surface (col. 1, ll. 44-67). It would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the method of Watts and Rhead such that the method similarly involves performing forensic determination of whether a building code has been met using the information among multiple objects behind the opaque surface. One of ordinary skill in the art would be motivated to do so in order to advantageously non-destructively verify construction techniques (see col. 1, ll. 60-67 of Smoot).
One such building code document is specification GA-216-2010 on the Application and Finishing of Gypsum Panel Products (2010) by the Gypsum Association. This document states, in 4.6.4, that "All ends and edges of gypsum panel products, except those described in Sections 4.6.4.1, 4.6.4.2, and 4.6.4.3, shall be located over framing members or other solid backing," and states, in 4.8.2, that "Fasteners at gypsum panel product edges or ends shall be located not less than 3/8 in. (10 mm) from the edge or end." Also see the nailing patterns in fig. 6 & 7. Given that, as noted above, Smoot teaches that it is advantageous to be able to non-destructively perform a forensic determination of whether a building code has been met using information on objects behind an opaque surface, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Watts, Rhead, and Smoot such that the deriving information about the objects behind the opaque surface comprises: using the sensor data and the location data to analyze the information of fasteners within close proximity from each other (determine a fastening pattern based on sensor data); and identifying an intersection of two adjoining drywall sheets behind the opaque surface (i.e., identify where two rows or columns of fasteners indicate a joint between drywall sheets, based on knowing that fasteners must be located a given distance from edges). One of ordinary skill in the art would be motivated to do so in order to be able to evaluate whether the drywall sheets are located according to code, such as not having gaps larger than ¼ in therebetween (see 4.6.8 of GA-216-2010).
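Purely as a non-limiting illustration of the joint-identification rationale described above (the function and the 2 in. pairing threshold are hypothetical and drawn from no applied reference), inferring a sheet joint from two closely spaced columns of fasteners could be sketched as:

```python
# Hypothetical sketch: given fastener x-positions (inches along a
# wall), two fastener columns closer than `max_pair_gap` are
# interpreted as the paired edge fasteners of two adjoining sheets,
# and the joint is taken to lie at their midpoint.
def find_joints(fastener_x_in, max_pair_gap=2.0):
    """Return midpoints of adjacent fastener columns closer than
    `max_pair_gap` inches, interpreted as sheet-joint locations."""
    xs = sorted(fastener_x_in)
    return [
        (a + b) / 2.0
        for a, b in zip(xs, xs[1:])
        if (b - a) <= max_pair_gap
    ]

print(find_joints([12.0, 47.0, 49.0, 96.0]))  # [48.0]
```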
With regards to claim 17, the combination of Watts and Rhead teaches the apparatus of claim 11. In this combination, the analyzing would use the sensor data and the location data as a matter of course, as all information analyzed and presented by Watts is based thereon. However, this combination does not expressly teach the one or more processors being further configured to: use the sensor data and the location data to analyze the information of fasteners within close proximity from each other; and identify an intersection of two adjoining drywall sheets behind the opaque surface.
Smoot teaches that it is advantageous to be able to perform a determination of whether a building code has been met using information about multiple objects behind an opaque surface (col. 1, ll. 44-67). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the apparatus of Watts and Rhead such that the one or more processors similarly perform a forensic determination of whether a building code has been met using the information about multiple objects behind the opaque surface. One of ordinary skill in the art would be motivated to do so in order to advantageously and non-destructively verify construction techniques (see col. 1, ll. 60-67 of Smoot).
One such building code document, specification GA-216-2010 on the Application and Finishing of Gypsum Panel Products (2010) by the Gypsum Association, states, in 4.6.4, that "All ends and edges of gypsum panel products, except those described in Sections 4.6.4.1, 4.6.4.2, and 4.6.4.3, shall be located over framing members or other solid backing," and states, in 4.8.2, that "Fasteners at gypsum panel product edges or ends shall be located not less than 3/8 in. (10 mm) from the edge or end." Also see the nailing patterns in figs. 6 and 7. Given that, as noted above, Smoot teaches that it is advantageous to be able to non-destructively perform a forensic determination of whether a building code has been met using information about objects behind an opaque surface, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Watts, Rhead, and Smoot such that the one or more processors are further configured to: use the sensor data and the location data to analyze the information of fasteners within close proximity from each other (i.e., determine a fastening pattern based on the sensor data); and identify an intersection of two adjoining drywall sheets behind the opaque surface (i.e., identify where two rows or columns of fasteners indicate a joint between drywall sheets, based on knowing that fasteners must be located a given distance from edges). One of ordinary skill in the art would be motivated to do so in order to be able to evaluate whether the drywall sheets are located according to code, such as not having gaps larger than 1/4 in. therebetween (see 4.6.8 of GA-216-2010).
Claims 9-10 and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Watts and Rhead as applied to claims 1 and 11 above, and further in view of US 8,854,043 to Candy et al. (hereinafter referred to as Candy).
With regards to claim 9, the combination of Watts and Rhead teaches the method of claim 8. However, this combination does not expressly teach the method comprising at least one of: displaying the information about the objects behind the opaque surface as a heat map; displaying the information about the objects behind the opaque surface as a contour map; displaying one or more user selected types of material behind the opaque surface; or a combination thereof.
Candy teaches the feature of displaying information about the locations of obscured objects and corresponding signal intensities in the form of a heat map (col. 7, l. 40 to col. 8, l. 3). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method of Watts and Rhead such that the method comprises displaying the information about the objects behind the opaque surface as a heat map, in a manner similar to Candy. Such a display would merely provide one known way of displaying information about the objects behind the opaque surface, and nothing about the scanning/analysis functionality in the base references would change. The result of this combination would thus be predictable to one of ordinary skill in the art, and this combination accordingly amounts to no more than the predictable use of prior-art elements according to their established functions.
With regards to claim 10, the combination of Watts, Rhead, and Candy teaches the method of claim 9. This combination further teaches displaying location data collected by the location tracker and sensor data collected by the sensor device on a topographic grid (the display of sensor data according to horizontal/vertical location data would be on a horizontal/vertical grid (when Watts is modified as above) related to the arrangement of physical features/objects behind the opaque surface and thus is understood to be topographic); and showing objects in relation to each other and their grid positions in the space behind the opaque surface (in fig. 4B, Watts teaches displaying objects in the space behind the opaque surface in relation to one another on display 12; Rhead teaches the same in fig. 16, etc.).
With regards to claim 19, the combination of Watts and Rhead teaches the apparatus of claim 18. However, this combination does not expressly teach the user interface being further configured to: display the information about the objects behind the opaque surface as a heat map; display the information about the objects behind the opaque surface as a contour map; display one or more user selected types of material behind the opaque surface; or a combination thereof.
Candy teaches the feature of displaying information about the locations of obscured objects and corresponding signal intensities in the form of a heat map (col. 7, l. 40 to col. 8, l. 3). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the apparatus of Watts and Rhead such that the user interface displays the information about the objects behind the opaque surface as a heat map, in a manner similar to Candy. Such a display would merely provide one known way of displaying information about the objects behind the opaque surface, and nothing about the scanning/analysis functionality in the base references would change. The result of this combination would thus be predictable to one of ordinary skill in the art, and this combination accordingly amounts to no more than the predictable use of prior-art elements according to their established functions.
With regards to claim 20, the combination of Watts, Rhead, and Candy teaches the apparatus of claim 19. This combination further teaches the user interface being further configured to: display location data collected by the location tracker and sensor data collected by the sensor device on a topographic grid (the display of sensor data according to horizontal/vertical location data would be on a horizontal/vertical grid (when Watts is modified as above) related to the arrangement of physical features/objects behind the opaque surface and thus is understood to be topographic); and show objects in relation to each other and their grid positions in the space behind the opaque surface (in fig. 4B, Watts teaches displaying objects in the space behind the opaque surface in relation to one another on display 12; Rhead teaches the same in fig. 16, etc.).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
US 10,401,532 to Sgarz et al. discloses a related imaging location device.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to James Split whose telephone number is (571)270-1524. The examiner can normally be reached Monday to Friday, 9:00 to 3:30.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Judy Nguyen can be reached at (571)272-2258. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JS/Examiner, Art Unit 2858
/JUDY NGUYEN/Supervisory Patent Examiner, Art Unit 2858