DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Objections
Claims 7 and 16 are objected to because of the following informalities: Claims 7 and 16 recite “the nature of the associated objects”. This should be changed to --a nature of the objects--. Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 6 and 15 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claims 6 and 15 recite “an error variable”. Claims 1 and 10, from which claims 6 and 15 respectively depend, also recite “an error variable”. It is unclear whether the error variable recited in claims 6 and 15 is the same as, or different from, the error variable recited in claims 1 and 10.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-6 and 10-15 are rejected under 35 U.S.C. 103 as being unpatentable over Shankar et al. (WO 2020/007483) in view of Derhy et al. (US 2020/0011668).
In regard to claim 1, Shankar et al. teach a method for determining a map of a three-dimensional environment, including a plurality of objects, based on at least a first image and a second image of said three-dimensional environment (fig. 2 and page 24, lines 9-11: relative pose between two frames is used to triangulate map points), said first image and said second image being respectively acquired by an image-acquisition device having two different poses (element 401 and page 24, lines 27 and 28), the method comprising: determining a plurality of common characteristic elements in the first image and the second image of said three-dimensional environment (fig. 3 and page 21, lines 7-9); determining groups of the characteristic elements by an image analysis such that each of the groups of the characteristic elements corresponds to an object of the three-dimensional environment (fig. 3, bounding box 342, and page 23, lines 1-7); and estimating a three-dimensional position of the characteristic elements based on a triangulation between the positions of the characteristic elements in the first image and the positions of the corresponding characteristic elements in the second image, the estimating comprising a sub-step of determining one of the poses of the image-acquisition device (fig. 4, element 435, and page 26, lines 29 and 30). Shankar et al. do not teach, for each of the groups of the characteristic elements, optimizing the three-dimensional positions of the characteristic elements of the respective group by minimizing an error variable that is a function, for all the characteristic elements of the respective group, of a distance between a detected position of said characteristic elements from at least the first image and a two-dimensional position obtained by application of a projection function to the corresponding estimated three-dimensional position, and determining the map of the three-dimensional environment using the optimized three-dimensional positions of the characteristic elements of the respective group.
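By way of illustration only, the two-view triangulation step mapped above can be sketched as follows. This sketch is not taken from Shankar et al.; the use of OpenCV, the placeholder camera matrices, and all variable names are assumptions.

```python
# Illustrative two-view triangulation of matched characteristic elements.
# All values below are placeholders, not data from the cited references.
import numpy as np
import cv2

# 3x4 projection matrices P = K[R|t] for the two poses of the
# image-acquisition device (assumed intrinsics K; second pose translated).
K = np.array([[700.0, 0.0, 320.0],
              [0.0, 700.0, 240.0],
              [0.0,   0.0,   1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])

# Matched characteristic elements as 2xN pixel coordinates (one column
# per element) detected in the first and second images, respectively.
pts1 = np.array([[100.0, 210.0], [120.0, 190.0]]).T
pts2 = np.array([[ 98.0, 211.0], [118.0, 191.0]]).T

# Triangulate to homogeneous 4xN points, then dehomogenize to obtain
# Nx3 estimated three-dimensional positions.
X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
X = (X_h[:3] / X_h[3]).T
```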
Derhy et al. teach, for each of the groups of the characteristic elements, optimizing the three-dimensional positions of the characteristic elements of the respective group by minimizing an error variable that is a function, for all the characteristic elements of the respective group, of a distance between a detected position of said characteristic elements from at least the first image and a two-dimensional position obtained by application of a projection function to the corresponding estimated three-dimensional position (figs. 2C-2E and paragraph 47. Derhy et al. teach finding measured 2D positions F0-meas and F1-meas. Paragraph 37 states this map is made for all the features in the frame. Derhy et al. also determine estimated positions by using the 3D coordinates of the camera. These positions are then put into an optimization routine to determine an optimized position; this optimization is used to reduce reprojection error.) and determining the map of the three-dimensional environment using the optimized three-dimensional positions of the characteristic elements of the respective group (paragraph 47, Derhy et al. teach determining an optimized pose estimate of the camera and refining the feature map).
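By way of illustration only, the kind of reprojection-error minimization attributed to Derhy et al. above can be sketched as follows. This is not Derhy et al.'s actual routine; the use of SciPy's least_squares, the pinhole projection function, and all names are assumptions.

```python
# Illustrative per-group minimization of an error variable built from
# reprojection distances (a sketch, not the routine of Derhy et al.).
import numpy as np
from scipy.optimize import least_squares

def project(X, P):
    """Projection function: Nx3 three-dimensional positions -> Nx2
    two-dimensional pixel positions, for a 3x4 projection matrix P."""
    X_h = np.hstack([X, np.ones((len(X), 1))])
    x = (P @ X_h.T).T
    return x[:, :2] / x[:, 2:3]

def residuals(params, P1, P2, obs1, obs2):
    """Error variable: distances between the detected 2D positions and
    the projections of the current 3D estimates, over both images."""
    X = params.reshape(-1, 3)
    return np.concatenate([(project(X, P1) - obs1).ravel(),
                           (project(X, P2) - obs2).ravel()])

def optimize_group(X0, P1, P2, obs1, obs2):
    """Refine the estimated 3D positions of one group of characteristic
    elements by least-squares minimization of the reprojection error."""
    result = least_squares(residuals, X0.ravel(),
                           args=(P1, P2, obs1, obs2))
    return result.x.reshape(-1, 3)
```

A full bundle adjustment would also include the camera pose among the optimized parameters; this sketch holds the poses fixed for brevity.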
The references are analogous art because they are from the same field of endeavor, namely image detection.
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to provide the apparatus of Shankar et al. with the position optimization of Derhy et al. The rationale is as follows: the optimization routine of Derhy et al. would reduce errors caused by noise or other defects.
In regard to claim 2, Shankar et al. teach estimating the three-dimensional position of the characteristic elements is implemented for each of the groups of the characteristic elements based on, for each of the groups of the characteristic elements, the triangulation between the positions of the characteristic elements of the respective group in the first image and the positions of the corresponding characteristic elements in the second image (page 27 lines 1-7).
In regard to claim 3, Shankar et al. teach determining the characteristic elements uses a method of identifying distinctive areas corresponding to shapes present in the first image and the second image (fig. 3 element 350 and page 21 lines 13-17).
In regard to claim 4, Shankar et al. teach determining the groups of the characteristic elements uses a neural network (NN) receiving as an input one or more of the first image and the second image of the three-dimensional environment and outputting a class for each of the pixels of the respective image (page 20 lines 12-24).
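By way of illustration only, grouping characteristic elements from a per-pixel class output can be sketched as follows (a hypothetical helper, not code from Shankar et al.):

```python
# Illustrative grouping of characteristic elements by the per-pixel
# class labels output by a segmentation network (hypothetical data).
from collections import defaultdict

def group_by_class(keypoints, class_map):
    """keypoints: iterable of (u, v) pixel coordinates of characteristic
    elements; class_map: HxW array of per-pixel class labels, as output
    by the neural network for the corresponding image."""
    groups = defaultdict(list)
    for u, v in keypoints:
        groups[int(class_map[int(v), int(u)])].append((u, v))
    return dict(groups)  # class label -> characteristic elements
```

Note that a per-pixel class alone would merge distinct objects of the same class; Shankar et al.'s bounding boxes (fig. 3, bounding box 342) would separate such instances.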
In regard to claim 5, Shankar et al. teach wherein the groups of the characteristic elements are obtained from the classes allocated to the characteristic elements (ConvNets are used to provide information regarding a person, car, chair, etc. These are classes of elements).
In regard to claims 6 and 15, Derhy et al. teach that the optimizing of the three-dimensional positions further comprises optimizing, for all the determined characteristic elements, the optimized three-dimensional position based on a minimization of an error variable that is a function, for all the determined characteristic elements, of a distance between the detected position of said characteristic elements from at least the first image and a two-dimensional position associated with the corresponding optimized three-dimensional position (paragraph 37, Derhy et al. teach the map being made for all the objects in a scene).
In regard to claim 10, this is the apparatus corresponding to the method of claim 1 and is rejected for the same reasons.
In regard to claim 11, Shankar et al. teach an image acquisition device (element 240).
In regard to claim 12, Shankar et al. teach wherein the determining the characteristic elements uses a method of identifying distinctive areas corresponding to shapes present in the first image and the second image (page 16 lines 25-30, Shankar et al. teach the keypoints being corners or edges).
In regard to claims 13 and 14, Shankar et al. teach wherein the determining the groups of the characteristic elements uses a neural network (NN) receiving as an input one or more of the first image and the second image of the three-dimensional environment and outputting a class for each of the pixels of the respective image (page 20 ConvNets).
Claims 7 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Shankar et al. in view of Derhy et al., and further in view of Chiu et al. (US 11361470).
In regard to claims 7 and 16, Shankar et al. and Derhy et al. teach all the elements of claims 7 and 16 except selecting groups of the characteristic elements based on the nature of the associated objects, the determining of the optimized three-dimensional positions being implemented only for the characteristic elements of the selected groups.
Chiu et al. teach selecting groups of the characteristic elements based on the nature of the associated objects, determining the optimized three-dimensional positions being implemented only for the characteristic elements of the selected groups (column 21 lines 24-28. Chiu et al. teach ignoring static elements).
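By way of illustration only, such a selection step might look like the following sketch (the class labels and the static/non-static split are assumptions, not Chiu et al.'s implementation):

```python
# Illustrative selection of groups based on the nature of the associated
# objects, so that optimization runs only for the selected groups.
STATIC_CLASSES = {"wall", "floor", "building"}  # hypothetical labels

def select_groups(groups):
    """groups: mapping of object class label -> characteristic elements.
    Returns only the groups whose objects are not treated as static,
    consistent with ignoring static elements."""
    return {label: elems for label, elems in groups.items()
            if label not in STATIC_CLASSES}
```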
The references are analogous art because they are from the same field of endeavor, namely image detection.
Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to provide the apparatus of Shankar et al. and Derhy et al. with the ignoring of objects shown in Chiu et al. The rationale is as follows: ignoring objects would reduce the computational load and would allow for quicker processing and easier navigation.
Allowable Subject Matter
Claims 8, 9 and 17 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
The following is an examiner’s statement of reasons for allowance: The prior art teaches identifying different classes of objects but does not show the error variable being minimized as a function of the type of object in combination with the claim’s other features.
Any comments considered necessary by applicant must be submitted no later than the payment of the issue fee and, to avoid processing delays, should preferably accompany the issue fee. Such submissions should be clearly labeled “Comments on Statement of Reasons for Allowance.”
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOSEPH R HALEY whose telephone number is (571)272-0574. The examiner can normally be reached 7:30am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Amr Awad can be reached at 571-272-7764. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JOSEPH R HALEY/ Primary Examiner, Art Unit 2621