Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant's arguments filed 01/28/2026 have been fully considered but they are not persuasive.
Applicant argues Likholyot does not teach an automated object recognition algorithm.
In response to applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986).
Laur, not Likholyot, teaches the automated object recognition algorithm ([0029]), as cited in the previous Office action. Thus, this argument is not persuasive.
Applicant argues prior art does not teach identifying overlap based on a feature position and feature extent. Instead, prior art teaches aligning data sets for use in identifying features.
Examiner respectfully disagrees. The feature recognition was taught by Laur, as cited in the previous Office action, not by Likholyot, against which applicant appears to be arguing.
Applicant argues that it would be against the teaching of Likholyot to modify in view of Laur and Baak.
Examiner respectfully disagrees. Applicant has provided no evidence for this argument; thus, it is unpersuasive and amounts to a mere allegation of patentability.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Likholyot (US 8699005 B2) in view of Laur (US 20200072943 A1), further in view of Baak (US 20180302611 A1).
Regarding Claim 1, Likholyot teaches a mapping system for mapping an environment, comprising:
a surveying apparatus, including:
a 2D rangefinder operable to measure at least two 2D emission patterns at an environmental surface (Fig 1, distant surface 3),
a first 2D emission pattern produced from a first location in a first room, a second 2D emission pattern produced from a second location in a second room, the first room and the second room being adjoining rooms (Col 6, lines 46-62 – moving rangefinder to measure two adjoining rooms),
and range information for each pattern (Col 4, lines 45-50),
and at least one imaging sensor operable to capture at least one image of the environmental surface (Fig 1, image sensor 9),
and wherein the surveying apparatus is operable to generate at least one set of configuration data indicative of an image position of the at least one image relative to the at least two 2D emission patterns (Col 7, lines 57-63 - alignment of images with 2D data);
and at least one processor communicatively coupled to the surveying apparatus to receive the at least two 2D emission patterns, the at least one image, and the at least one set of configuration data (Fig 1, computing device 5),
the at least one processor (Fig 1, computing system 5) operable to:
apply an automated matching algorithm to the at least one image and the first 2D emission pattern using the at least one set of configuration data (Col 6, lines 51-61 – bringing all images into common coordinate system, thus establishing position);
identify a first overlap portion of the first 2D emission pattern that overlaps with the second 2D emission pattern based on the feature position and the feature extent of the at least one feature recognized from the at least one image, the first overlap portion being a part of the first 2D emission pattern captured in the second room as seen through the at least one feature; and automatically align the first 2D emission pattern and the second 2D emission pattern using the first overlap portion (Col 6, lines 46-62 – determining overlap between two data sets in different rooms and aligning the data sets),
apply an automated projection algorithm to project the at least one 2D emission pattern onto a horizontal plane to generate a map of the environment (Col 9, lines 5-10),
Likholyot does not teach applying an automated object recognition algorithm to the at least one image in an automated process to recognize at least one feature, a feature position of the at least one feature, and a feature extent of the at least one feature, from at least one image, the at least one feature being at least one of a doorway and a window of the environment. Instead, Likholyot teaches a user input device which allows manual input of doors and windows (Col 9, lines 26-36).
Laur teaches a radar-data collection system which includes a camera (Fig 1, camera 14). The camera image (Fig 1, image 24) is then used for image processing to determine the identity of an object ([0029]). A radar profile is then annotated with the identified object ([0034]).
It would have been obvious to use the object recognition and annotation method, as taught by Laur, with the mapping system as taught by Likholyot because, as Laur says, “Those in the automated-vehicle object-detection arts will recognize that the identity of an object is often more readily determine based on an image from a camera rather than a radar-return from radar” ([0011]).
Likholyot, as modified in view of Laur, does not teach, but Baak does teach, when the at least one feature is a window or a mirror, remove ghost data caused by reflections from the window or the mirror from the at least one 2D emission pattern using the marked position of the window or the mirror on the map ([0003] – describing automated detection and classification, along with [0066] – using two images to improve recognition of shiny surfaces such as windows).
It would have been obvious before the effective filing date to combine Baak’s method for removing erroneous data from windows with the mapping system as taught by Likholyot, as modified in view of Laur. Specifically, Laur’s two images (camera and radar image) and method of associating an object with a radar profile (as described above) could be used as the two images in Baak’s method. This would mitigate errors brought on by windows or other shiny/reflective objects, as taught by Baak ([0066]).
Regarding Claim 2, Likholyot, as modified in view of Laur and Baak, teaches the mapping system of claim 1, wherein the at least one processor is operable to mark the map to indicate the feature position and the feature extent of the at least one feature (Likholyot Col 9, lines 45-46 – marking extent of doors).
Regarding Claim 3, Likholyot, as modified in view of Laur and Baak, teaches the mapping system of claim 1, wherein the at least one processor is operable to identify a second overlap portion of the second 2D emission pattern that overlaps with the first 2D emission pattern based on the feature position and the feature extent of the at least one feature, the second overlap portion being a part of the second 2D emission pattern captured in the first room as seen through the at least one feature; wherein automatically aligning the first 2D emission pattern and the second 2D emission pattern includes using the first overlap portion and the second overlap portion simultaneously (Likholyot Col 6, lines 44-62 - describing two rangefinder positions and alignment).
Regarding Claim 4, Likholyot, as modified in view of Laur and Baak, teaches the mapping system of claim 3, further comprising at least one of an electronic compass and an inertial measurement unit operable to generate a set of positional information indicative of an emission position of the 2D rangefinder, the at least one processor communicatively coupled to the surveying apparatus to receive the set of positional information and operable to apply the set of positional information when automatically aligning the first and second 2D emission patterns (Likholyot Col 6, lines 11-34 - describing IMU and compass).
Regarding Claim 5, Likholyot, as modified in view of Laur and Baak, teaches the mapping system of claim 1,
wherein the surveying apparatus includes an optical imaging system (Likholyot Fig 1, imaging system 7)
the optical imaging system including the at least one imaging sensor and an objective lens (Likholyot Fig 1, image sensor 9 and objective lens 8)
and the at least one set of configuration data includes at least one calibration coefficient indicative of a focal length and a distortion of the objective lens, a sensor position and an orientation of the at least one image sensor relative to the objective lens, and a lens position and an orientation of the objective lens relative to the 2D rangefinder (Likholyot Col 5, lines 28-34).
Regarding Claim 6, Likholyot, as modified in view of Laur and Baak, teaches the mapping system of claim 1, wherein the 2D rangefinder includes a laser 2D rangefinder (Likholyot Col 7, line 47 - "triangulation laser rangefinder").
Regarding Claim 7, Likholyot, as modified in view of Laur and Baak, teaches the mapping system of claim 6, wherein the 2D rangefinder is a scanning laser rangefinder (Likholyot Col 7, line 47 - "triangulation laser rangefinder"), the scanning laser rangefinder including a rangefinder sensor (Likholyot Fig 1, image sensor 9).
Regarding Claim 8, Likholyot, as modified in view of Laur and Baak, teaches the mapping system of claim 6, wherein the 2D rangefinder is a triangulation laser rangefinder, the triangulation laser rangefinder including the at least one imaging sensor (Likholyot Fig 1, image sensor 9).
Regarding Claim 9, Likholyot, as modified in view of Laur and Baak, teaches the mapping system of claim 1, wherein the surveying apparatus includes a stand (Likholyot Fig 1, stand 15) and a rotator (Likholyot Fig 1, rotator 14), the rotator operable to rotate the 2D rangefinder and the at least one imaging sensor relative to the stand, the rotator having a rotation axis that is substantially perpendicular to an optical axis of the at least one imaging sensor (Likholyot Col 8, lines 12-17 - describing rotation).
Regarding Claim 10, Likholyot, as modified in view of Laur and Baak, teaches the mapping system of claim 1, wherein the at least one 2D emission pattern extends at least 180 degrees (Likholyot Col 8, lines 39-43 - two angular positions 180 degrees apart).
Claims 11-18 are method claims corresponding to product claims 1-8 and are rejected for the same reasons as set forth above.
Regarding Claim 19, Likholyot, as modified in view of Laur and Baak, teaches the method of claim 11, wherein the 2D emission pattern extends at least 180 degrees (Likholyot Col 8, lines 39-43 - two angular positions 180 degrees apart).
Regarding Claim 20, Likholyot teaches a mapping system for mapping an environment, comprising:
a surveying apparatus, including:
a 2D rangefinder operable to measure a first 2D emission pattern at at least one environmental surface from a first location and to measure a second 2D emission pattern at the at least one environmental surface from a second location different from the first location (Fig 2, first 2D data set 30 and second 2D data set 32),
each of the first and second 2D emission patterns including range information (Col 6, lines 44-62 - describing two rangefinder positions),
and at least one imaging sensor operable to capture at least one image of the at least one environmental surface (Fig 1, image sensor 9),
wherein the surveying apparatus is operable to generate at least one set of configuration data indicative of an image position of the at least one image relative to at least one of the first and second 2D emission patterns (Col 7, lines 57-63 - alignment of images with 2D data);
and at least one processor communicatively coupled to the surveying apparatus to receive the first and second 2D emission patterns, the at least one image, and the at least one set of configuration data (Fig 1, computing device 5),
the at least one processor operable to: apply an automated object recognition process to the at least one image to recognize at least one feature from the at least one image (Col 6, lines 51-61 – finding doorway),
the at least one feature being at least one of a doorway, a mirror, and a window of the environment (Col 6, lines 51-61 – finding doorway),
automatically align the first and second 2D emission patterns using the feature position to identify an overlap in the first and second 2D emission patterns, and apply an automated projection process to project the aligned first and second 2D emission patterns onto a horizontal plane to generate the map of the environment (Col 9, lines 5-10).
Likholyot does not teach applying an automated object recognition algorithm to the at least one image in an automated process to recognize at least one feature, a feature position of the at least one feature, and a feature extent of the at least one feature, from at least one image, the at least one feature being at least one of a doorway, a mirror, and a window of the environment. Instead, Likholyot teaches a user input device which allows manual input of doors and windows (Col 9, lines 26-36).
Laur teaches a radar-data collection system which includes a camera (Fig 1, camera 14). The camera image (Fig 1, image 24) is then used for image processing to determine the identity of an object ([0029]). A radar profile is then annotated with the identified object ([0034]).
It would have been obvious to use the object recognition and annotation method, as taught by Laur, with the mapping system as taught by Likholyot because, as Laur says, “Those in the automated-vehicle object-detection arts will recognize that the identity of an object is often more readily determine based on an image from a camera rather than a radar-return from radar” ([0011]).
Likholyot, as modified in view of Laur, does not teach, but Baak does teach, when the at least one feature is a window or a mirror, remove ghost data caused by reflections from the window or the mirror from the at least one 2D emission pattern using the marked position of the window or the mirror on the map ([0003] – describing automated detection and classification, along with [0066] – using two images to improve recognition of shiny surfaces such as windows).
It would have been obvious before the effective filing date to combine Baak’s method for removing erroneous data from windows with the mapping system as taught by Likholyot, as modified in view of Laur. Specifically, Laur’s two images (camera and radar image) and method of associating an object with a radar profile (as described above) could be used as the two images in Baak’s method. This would mitigate errors brought on by windows or other shiny/reflective objects, as taught by Baak ([0066]).
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to CLARA CHILTON whose telephone number is (703)756-1080. The examiner can normally be reached Monday-Friday 6-2 MT.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Helal Algahaim can be reached at 571-270-5227. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CLARA G CHILTON/Examiner, Art Unit 3645
/HELAL A ALGAHAIM/SPE, Art Unit 3645