DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Joint Inventors
This application currently names joint inventors. In considering patentability of the claims, the Examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the Examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Status of Claims
This action is in response to Applicant’s filing on 03/04/2025. Claims 16-34 are pending and examined below.
Claim Objections
Claim 20 is objected to because of the following informalities:
Claim 20 ends with the phrase ‘… and,’ without a period to conclude the claim. Examiner believes that the claim is meant to end after the term “object” (similar to claim 30).
Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 16-34 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Regarding claim 16, the claim recites the limitation “interoceptive sensor”. However, it is unclear what an “interoceptive sensor” is, as the instant specification makes no mention of an “interoceptive sensor” and instead describes “proprioceptive sensors”. As such, claim 16 is rejected under 35 U.S.C. 112(b) for being indefinite. For examination purposes, the examiner is interpreting the “interoceptive sensors” to be “proprioceptive sensors”, as described in the instant specification. Because this claim positively recites “at least one of an inertial measurement unit (IMU)”, the examiner will interpret the limitation of this claim to cover an IMU.
Claims 17-25 ultimately depend from claim 16 and contain the same indefinite subject matter as seen above in claim 16. As such, claims 17-25 are also rejected under 35 U.S.C. 112(b) for being indefinite.
Regarding claims 26 and 27-34, the claim limitations are similar to those in claims 16 and 17-25, respectively, and are rejected using the same rationale as seen above in claims 16 and 17-25.
Additionally, claims 23 and 32 recite the phrase "such as" which renders the claim indefinite because it is unclear whether the limitations following the phrase are part of the claimed invention. See MPEP § 2173.05(d).
Additionally, claims 25 and 34 recite the term “improved accuracy” which is a relative term that renders the claim indefinite. The term “improved accuracy” is not defined by the claim, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. The claims purport to ‘improve accuracy’ of the actual position determined by the robot, however, there is no indication of how much the accuracy improves or how wheel slippage is actually used to ‘improve accuracy’. As such, claims 25 and 34 are rejected under 35 U.S.C. 112(b) for being indefinite.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 16-24 and 26-33 are rejected under 35 U.S.C. 103 as being obvious over Liao et al., US 20170277197 A1, herein referred to as Liao, in view of Pajovic et al., US 20200164508 A1, herein referred to as Pajovic.
Regarding claim 16, Liao discloses capturing a plurality of images at different time instances using a sensing device (Fig. 1, Paragraph 0046; the robot acquires images as it traverses through an environment, each image may be at a different time instance (first image at a first location, etc.)), determining an image disparity between the captured images by spatially transforming a subsequent image to align with an initial image (Fig. 1, Paragraph 0046; the robot may compare feature points in different images to determine differences, the robot may determine translations and/or rotations to match up feature points in the image, these translations/rotations may be considered a spatial transform that aligns images based on the given feature points), determining a change in pose of the sensing device due to the robot moving, wherein the change in pose is determined based on the image disparity (Fig. 1, Paragraph 0046; the robot may determine a change in its pose based on variances between feature points in images), determining a first motion of the robot based on the determined change in pose of the sensing device (Figs. 1, 7-8, Paragraphs 0046, 0107; a first change in pose may be determined by the robot, this can be considered the first motion), determining a second motion of the robot based on data from at least one interoceptive sensor (Figs. 1, 7-8, Paragraphs 0046, 0107; visual odometry may be used to determine a second motion of the robot through the environment; visual odometry may be performed using the camera sensors which can be considered ‘interoceptive’ sensors), determining the pose of the sensing device based on a motion disparity between the first motion determined from the change in pose of the sensing device and the second motion of the robot (Figs. 1, 7-8, Paragraphs 0046, 0107-0109; changes in pose may be determined based on image feature matching (first motion) and visual odometry through the camera sensors; the poses determined may be compared and/or updated to a correct pose determination), and localizing an object on a computer readable map (Figs. 1, 7-8, Paragraphs 0029-0030, 0046, 0107-0109; obstacles may be located on a depth map), but fails to disclose determining a second motion of the robot based on data from at least one interoceptive sensor, wherein the second motion is determined using a probabilistic distribution of the pose of the robot, and localizing an object on a computer readable map based at least in part on a most probable pose of the sensing device.
However, Pajovic, in an analogous field of endeavor, teaches determining a second motion of the robot based on data from at least one interoceptive sensor, wherein the second motion is determined using a probabilistic distribution of the pose of the robot (Paragraphs 0056-0063; a robot’s future pose may be determined based on a probability distribution of possible poses), and localizing an object on a computer readable map based at least in part on a most probable pose of the sensing device (Paragraphs 0056-0063; obstacles may be determined based on the robot’s pose during traversal; pose probability may also affect the obstacle placement in a map (obstacles located based on likely robot pose)). Therefore, from the teaching of Pajovic, it would have been obvious to one of ordinary skill in the art before the effective filing date to have modified, with a reasonable expectation of success, the robotic invention of Liao to include determining a second motion of the robot based on data from at least one interoceptive sensor, wherein the second motion is determined using a probabilistic distribution of the pose of the robot, and localizing an object on a computer readable map based at least in part on a most probable pose of the sensing device, as taught/suggested by Pajovic. The motivation to do so would be to determine motion of a robot based on probability, as well as determine likely obstacles around locations where the robot is likely to be. This can allow for better navigation as the robot can determine obstacles ahead of time and can initiate better avoidance maneuvers.
Regarding claim 17, Liao in view of Pajovic renders obvious all the limitations of claim 16. Liao further discloses updating the pose of the robot based on the determined pose of the sensing device (Figs. 1, 7-8, Paragraphs 0029-0030, 0046, 0107-0109; the robot’s pose may be updated based on received data), but fails to disclose updating the probabilistic distribution of the pose of the robot based on the determined pose of the sensing device. However, the obviousness of utilizing a probabilistic distribution is shown in the rationale for claim 16 and would be applicable here as well.
Regarding claim 18, Liao in view of Pajovic renders obvious all the limitations of claim 16. Liao further discloses determining the second motion of the robot comprises: representing the pose of the robot with a plurality of particles, each particle having a position and an orientation (Figs. 1, 7-8, Paragraphs 0029-0030, 0046, 0107-0109; determination of the robot’s second motion (see claim 16) may be done through visual odometry which tracks features in sequential images to determine motion; these features within the image may have a given position and orientation; these features may also be considered a plurality of particles (pixels/pixel clusters)), and updating the positions and orientations of the particles based on the data from the at least one interoceptive sensor (Figs. 1, 7-8, Paragraphs 0029-0030, 0046, 0107-0109; feature points in images may be updated by translating/rotating them in order to match up to other images).
Regarding claim 19, Liao in view of Pajovic renders obvious all the limitations of claim 18. Liao further discloses updating the positions and orientations of the particles is performed sequentially (Figs. 1, 7-8, Paragraphs 0029-0030, 0046, 0107-0109; transformation of image features through translation/rotation may be done between sequential images).
Regarding claim 20, Liao in view of Pajovic renders obvious all the limitations of claim 16. Liao further discloses capturing the plurality of images comprises capturing a first image at an initial time instance and capturing a second image at a subsequent time instance using the sensing device, wherein the first image and the second image comprise pixels of at least one target object (Figs. 1, 7-8, Paragraphs 0029-0030, 0046, 0107-0109; each image taken at sequential times may utilize feature points for determination of poses; feature points may correspond to obstacles in the environment and are represented as pixels in the image).
Regarding claim 21, Liao in view of Pajovic renders obvious all the limitations of claim 20. Liao further discloses determining the change in pose of the sensing device based on spatial transformations applied to the subsequent image (Figs. 1, 7-8, Paragraphs 0029-0030, 0046, 0107-0109; changes in pose of the robot may be determined based on spatial transformations of feature points in an image between two sequential images).
Regarding claim 22, Liao in view of Pajovic renders obvious all the limitations of claim 20. Liao further discloses the sensing device comprises at least one of a stereo camera or a depth sensor (Figs. 1, 7-8, Paragraphs 0029-0030, 0046, 0107-0109; the images may be captured by utilizing stereo cameras).
Regarding claim 23, Liao in view of Pajovic renders obvious all the limitations of claim 16. Liao further discloses at least one interoceptive sensor (Figs. 1, 7-8, Paragraphs 0046, 0107; visual odometry may be used to determine a second motion of the robot through the environment; visual odometry may be performed using the camera sensors which can be considered ‘interoceptive’ sensors), but fails to disclose the at least one interoceptive sensor comprises at least one of an inertial measurement unit (IMU), such as encoders, gyroscopes, or accelerometers. However, Pajovic teaches the at least one interoceptive sensor comprises at least one of an inertial measurement unit (IMU), such as encoders, gyroscopes, or accelerometers (Paragraph 0106; sensors may include an IMU). Therefore, from the teaching of Pajovic, it would have been obvious to one of ordinary skill in the art before the effective filing date to have further modified, with a reasonable expectation for success, the robotic system of Liao and Pajovic to include the at least one interoceptive sensor comprises at least one of an inertial measurement unit (IMU), such as encoders, gyroscopes, or accelerometers, as taught/suggested by Pajovic. The motivation to do so would be to utilize well known sensors for obtaining data. This can allow for improved accuracy and better motion control of the robot.
Regarding claim 24, Liao in view of Pajovic renders obvious all the limitations of claim 20. Liao further discloses determining an expected measurement of at least one target object at an expected position of the robot based on odometry data (Figs. 1, 7-8, Paragraphs 0029-0030, 0046, 0088-0089, 0101-0109; the robot may determine an obstacle’s expected location based on visual odometry; the data concerning the obstacle may be considered an expected measurement as it is preliminary in nature), and determining an actual position of the robot based on a difference between the expected measurement and an actual measurement of the at least one target object obtained from the captured images (Figs. 1, 7-8, Paragraphs 0029-0030, 0046, 0088-0089, 0101-0109; the robot may determine its actual pose based on differences between sequential images and obstacle feature points in those images; this determination may be made based on updated feature point data during obstacle detection (obstacles may be captured in subsequent imaging, which results in actual placement of the obstacle in the map)).
Regarding claim 26, a portion of the claim limitations are similar to those in claim 16 and are rejected using the same rationale as seen above in claim 16. Additionally, Liao discloses a processor (Fig. 1; the system may include a processor).
Regarding claims 27-31 and 33, the claim limitations are similar to those in claims 17-21 and 24, respectively, and are rejected using the same rationale as seen above in claims 17-21 and 24.
Regarding claim 32, the claim limitations are similar to those in claims 22 and 23, and are rejected using the same rationale as seen above in claims 22 and 23.
Claims 25 and 34 are rejected under 35 U.S.C. 103 as being obvious over Liao, in view of Pajovic, and further in view of Ebrahimi Afrouzi et al., US 20200225673 A1, herein referred to as Afrouzi.
Regarding claim 25, Liao in view of Pajovic renders obvious all the limitations of claim 24. Liao further discloses the actual position of the robot is determined by the robot during navigation (Figs. 1, 7-8, Paragraphs 0029-0030, 0046, 0088-0089, 0101-0109; the robot’s actual pose may be determined during navigation), but fails to disclose the actual position of the robot is determined with improved accuracy by accounting for wheel slippage experienced by the robot during navigation. However, Afrouzi, in an analogous field of endeavor, teaches the actual position of the robot is determined with improved accuracy by accounting for wheel slippage experienced by the robot during navigation (Paragraph 0335; wheel slippage may be accounted for when determining the robot’s location). Therefore, from the teaching of Afrouzi, it would have been obvious to one of ordinary skill in the art before the effective filing date to have further modified, with a reasonable expectation of success, the robotic system of Liao and Pajovic to include the actual position of the robot is determined with improved accuracy by accounting for wheel slippage experienced by the robot during navigation, as taught/suggested by Afrouzi. The motivation to do so would be to account for a common occurrence in wheeled vehicles when determining the robot’s position. This can increase the accuracy of motion control as the robot can mitigate or account for errors during operation.
Regarding claim 34, the claim limitations are similar to those in claim 25 and are rejected using the same rationale as seen above in claim 25.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHRISTOPHER ALLEN BUKSA whose telephone number is (571)272-5346. The examiner can normally be reached M-F 7:30 AM-4:30 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Thomas Worden can be reached at (571) 272-4876. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CHRISTOPHER A BUKSA/Examiner, Art Unit 3658