Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
This action is in response to the applicant’s filing on February 03, 2026. Claims 1-20 are pending.
Response to Amendment and Arguments
Applicant's arguments, based on the filed amendment, with respect to the 35 U.S.C. 102 rejections of the previous Office action have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground of rejection is made.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1, 10, and 16 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for pre-AIA applications, the applicant) regards as the invention.
Claims 1, 10, and 16 recite the limitation “the single fused sensor data…”. There is insufficient antecedent basis for this limitation in the claims.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Roy et al., US 2019/0107846 (“Roy”), in view of Nackers et al., US 2018/0372498 (“Nackers”).
Regarding claims 1, 10, and 16, Roy discloses a method comprising:
receiving, at a multi-IMU combination unit (MICU) executed by a processing device, sensor data from a plurality of inertial data sensors of a same sensor type and in a single vehicle (fig. 4, [0119] FIG. 4 shows a first fusion model of the AI application 14. Here, the sensor data 32 from multiple IMUs/sensors 60-1 through 60-N are first combined/fused, and then formatted. Here, the sensor data 32 is fused into a collection of statistical signals/weights that represent the “raw signals” of the sensor data. Typical fusion algorithms/methods are shown in the figure.);
for each inertial data sensor, calibrating and transforming the single fused sensor data using a calibration estimate for the inertial data sensor ([0068] The trained neural networks on the UAVs process the input sensor signals to remove noise and if the signal is from one or more sensors perform sensor fusion. Preprocessing and refining the sensor signals ensure that the signal is corrected for noise, fused (if sensor fusion) and calibrated so that the concerned neural networks that use it for predictions, control feedback and interpretations have inputs in the expected format.),
where the calibration estimate is generated using a calibration estimator, the calibration estimate being based on pre-integration methods that provide individual kinematic feedback that is compared to fused kinematic feedback from a main filter ([0106] The sensor fusion strategy can be one or more several approaches depending on the number, type and sensitivity of the sensor and accuracy desired. Sensor fusion can be carried out by the trained neural network 99 when trained. On the other hand, sensor input signals can first filtered (using for example: Kalman, extended Kalman or unscented Kalman filters) then fused either directly or by the trained neural network 9. Sensor fusion can also be first done using a trained neural network with feedback of the output from one or more filters.);
combining the calibrated and transformed sensor data from the plurality of inertial data sensors into a fused output for the same sensor type; sampling the fused output to provide a single inertial data measurement for the plurality of inertial data sensors ([0106]… Sensor fusion can also be first done using a trained neural network with feedback of the output from one or more filters. Sensor signals are each processed by a neural network, fed into one or more filters in a feedback loop and then fused by a trained neural network. Sensor signals can also be decomposed into their PCA (basis) then fused directly or by a trained neural network. Any combination of the above can be used dynamically and autonomously depending on the sensors and whether training the networks or deployed); and
providing the fused kinematic feedback for the calibration estimate, the fused kinematic feedback generated from the sampling of the single fused output ([0121] In more detail, at the AI application 14, an IMU Observation Fusion module 502 receives sensor data 32 from the multiple sensors 60-1 . . . 60-N. The fusion module 502 provides the combined sensor data 32 to an INS Kalman filter module 504. [0122] The Kalman filter module 504 also accepts sensor data 32 from the GPS sensor 60. The filter module 504 also has a system feedback loop 510. [0123] The Kalman filter module 504 has the following inputs. The inputs include the sensor data 32 from the GPS sensor 60, the feedback loop 510, and the combined sensor data 32 from the fusion module 302.).
Roy does not explicitly disclose providing the fused output to a localization stack of the single vehicle, generating a set of main filter kinematics based on the fused output, and providing the set of main filter kinematics estimate to the calibration estimator.
Nackers teaches a system and method using sensor fusion in which, for each of the IMUs on a separate component of the machine, the signals received from the IMU are fused with a separate Kalman filter module by combining an acceleration measurement and an angular rate of motion measurement from the IMU to estimate an output joint angle for the component of the machine on which the IMU is mounted. Additionally, Nackers teaches providing the fused output to a localization stack of the single vehicle, generating a set of main filter kinematics based on the fused output, and providing the set of main filter kinematics estimate to the calibration estimator (abstract; paras. 23-29; FIG. 2: the output joint angles that have been fused at the machine level by the Kalman filter 240 may be received by a kinematic library module 260. The kinematics library module 260 may be configured to receive the output joint angles from the Kalman filter(s) 240 and dimensional design information specific to the machine 10 from a dimensional design information database 250, and solve for a frame rotation and position at each component or point of interest on the machine. The frame can have offsets applied to the information derived from the IMU's in order to solve for any particular point on the machine, and all of the updated position information can be provided to a machine state control system 50, which may be associated with or programmed as part of the machine ECM.).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system and method of Roy by incorporating the applied teaching of Nackers in order to obtain better estimates that reduce propagated error and improve orientation readings for the vehicle, and one of ordinary skill in the art before the effective filing date of the claimed invention would have recognized that the results of the combination would have been predictable.
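For clarity of the mapping above, the following minimal sketch (illustrative only; the class name, the bias-only calibration model, the mean-based fusion, and the 0.01 feedback gain are hypothetical assumptions, not code or parameters from Roy, Nackers, or the claims) shows one way the claimed calibrate-fuse-feedback loop could be structured:

```python
import numpy as np

class MultiImuCombiner:
    """Hypothetical sketch of a multi-IMU combination unit (MICU):
    per-sensor calibration, fusion into a single measurement, and
    fused kinematic feedback to the calibration estimator."""

    def __init__(self, num_sensors):
        # Per-sensor calibration estimates (bias only, for illustration).
        self.biases = [np.zeros(3) for _ in range(num_sensors)]
        # Tunable parameter that turns the fused kinematic feedback on or off.
        self.feedback_enabled = True

    def step(self, raw_samples):
        # Calibrate and transform each sensor's reading into a common frame.
        calibrated = [s - b for s, b in zip(raw_samples, self.biases)]
        # Combine calibrated readings into one fused output (a simple mean
        # here; a Kalman filter or learned fusion would replace this).
        fused = np.mean(calibrated, axis=0)
        if self.feedback_enabled:
            # Feed the fused kinematics back: nudge each sensor's bias
            # estimate toward its residual against the fused output.
            for i, c in enumerate(calibrated):
                self.biases[i] += 0.01 * (c - fused)
        return fused  # single inertial measurement for the sensor set
```

In this sketch the returned fused measurement would be the quantity provided to a localization stack, and the bias update plays the role of the fused kinematic feedback to the calibration estimator.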
Regarding claim 2, Roy discloses wherein the MICU is hosted by an autonomous vehicle (AV) comprising the plurality of inertial data sensors ([0005] A vehicular system that provides autonomous control and management of vehicles such as unmanned aerial vehicles (UAVs) is proposed. The system can control, possibly in stealth, hundreds or possibly thousands of UAVs maneuvering in groups and optimally use resources of the UAVs for solving a given problem (e.g., dealing with ground cover based on visibility and weather). The system can further enable operation of the UAVs in differential GPS and GPS-denied environments, in examples.).
Regarding claims 3, 11, and 17, Roy discloses wherein the inertial data sensors comprise one or more of an inertial measurement unit (IMU), a coordinate measuring machine (CMM), or a high-resolution wheel encoder (HRWE) (fig. 4).
Regarding claims 4, 12, and 17, Roy discloses wherein combining the calibrated and transformed sensor data further comprises fitting an N-degree polynomial using the calibrated and transformed sensor data ([0061] As the groups of UAVs and number of UAVs within the groups increase over time, the use of these local controller UAVs promotes scalability. As a result, the cloud-based architecture of the distributed control system 16, combined with usage of the multiple local controller UAVs within the groups/swarms 702, avoids NP complete problems (polynomial space and time) that would otherwise occur if the controller 44 were to communicate with each of the UAVs directly, promotes scalability to hundreds or possibly thousands of UAVs, and possibly hundreds or thousands of groups of the UAVs.).
Regarding claims 5, 13, and 19, Roy discloses wherein the N-degree polynomial is fitted using tuning parameters to control smoothing and fit of the N-degree polynomial, and wherein the tuning parameters are for at least one of a length of time window of the sensor data or a degree of the N-degree polynomial ([0061] As the groups of UAVs and number of UAVs within the groups increase over time, the use of these local controller UAVs promotes scalability. As a result, the cloud-based architecture of the distributed control system 16, combined with usage of the multiple local controller UAVs within the groups/swarms 702, avoids NP complete problems (polynomial space and time) that would otherwise occur if the controller 44 were to communicate with each of the UAVs directly, promotes scalability to hundreds or possibly thousands of UAVs, and possibly hundreds or thousands of groups of the UAVs.).
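As an illustration of the polynomial-fitting limitation of claims 4, 5, and their counterparts (a minimal sketch; the function name, the 0.5-second default window, and the degree-3 default are hypothetical tuning choices, not values from the claims or the cited art):

```python
import numpy as np

# Fit an N-degree polynomial to a trailing time window of calibrated
# samples. `window` (seconds of data) and `degree` are the tuning
# parameters that trade smoothing against closeness of fit.
def fit_window(timestamps, samples, window=0.5, degree=3):
    t = np.asarray(timestamps)
    y = np.asarray(samples)
    mask = t >= (t[-1] - window)                   # keep only the trailing window
    coeffs = np.polyfit(t[mask], y[mask], degree)  # least-squares polynomial fit
    return np.poly1d(coeffs)                       # smoothed, resampleable signal
```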
Regarding claims 6 and 20, Roy discloses tagging at least one of the inertial data sensors as a faulty inertial data sensor responsive to the faulty inertial data sensor generating estimates that trigger a fault condition indicating that the estimates do not agree with the fit of the N-degree polynomial ([0107] The AI application 14 performs learning tasks, as well as providing output to the controller 44 for various operations at the UAVs 102. These operations include: guidance and navigation, sensor calibration/recalibration, communication type and protocol to use, interface(s) to the local controller UAV 102 and system controller 44, swarm formation parameters and allowable errors and tolerances, in examples. In this way, each UAV has information that includes any limits placed upon its flight path/destinations from the simulations.).
Regarding claim 7, Roy discloses wherein the faulty inertial data sensor is removed from operation and remaining inertial data sensors of the plurality of inertial data sensors continue to operate ([0107] The AI application 14 performs learning tasks, as well as providing output to the controller 44 for various operations at the UAVs 102. These operations include: guidance and navigation, sensor calibration/recalibration, communication type and protocol to use, interface(s) to the local controller UAV 102 and system controller 44, swarm formation parameters and allowable errors and tolerances, in examples. In this way, each UAV has information that includes any limits placed upon its flight path/destinations from the simulations.).
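As an illustration of the fault-tagging and removal limitations of claims 6 and 7 (again a hypothetical sketch; the maximum-residual test against the fitted polynomial and the 0.5 threshold are assumptions, not features disclosed by the cited art):

```python
import numpy as np

# Tag a sensor as faulty when its readings disagree with the fitted
# polynomial beyond a threshold, then drop it from the fused set while
# the remaining sensors continue to operate.
def detect_faults(poly, t, per_sensor_samples, threshold=0.5):
    healthy = []
    for idx, samples in enumerate(per_sensor_samples):
        residual = np.max(np.abs(np.asarray(samples) - poly(t)))
        if residual > threshold:
            print(f"sensor {idx} tagged faulty (residual {residual:.3f})")
        else:
            healthy.append(idx)
    return healthy  # indices of sensors kept in operation
```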
Regarding claim 8, Roy discloses wherein a tunable parameter of the MICU enables the fused kinematic feedback to be turned on or off for purposes of generating the calibration estimate ([0068] The trained neural networks on the UAVs process the input sensor signals to remove noise and if the signal is from one or more sensors perform sensor fusion. Preprocessing and refining the sensor signals ensure that the signal is corrected for noise, fused (if sensor fusion) and calibrated so that the concerned neural networks that use it for predictions, control feedback and interpretations have inputs in the expected format. [0069] In some cases, QoS state functions values of the given UAV may result in the UAV not processing the signal (due to low available power for example or using its processing resources for more important control work as another example) and the raw signal will be sent directly to the controller, for processing and feedback to the local UAV.).
Regarding claim 9, Roy discloses that a consumer of the fused output comprises a localization stack of an autonomous vehicle (AV) ([0005] A vehicular system that provides autonomous control and management of vehicles such as unmanned aerial vehicles (UAVs) is proposed. The system can control, possibly in stealth, hundreds or possibly thousands of UAVs maneuvering in groups and optimally use resources of the UAVs for solving a given problem (e.g., dealing with ground cover based on visibility and weather). The system can further enable operation of the UAVs in differential GPS and GPS-denied environments, in examples.).
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
Inquiry
Any inquiry concerning this communication or earlier communications from the examiner should be directed to TRUC M DO, whose telephone number is (571) 270-5962. The examiner can normally be reached from 9 AM to 6 PM.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ramón Mercado, Ph.D., can be reached at (571) 270-5744. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/TRUC M DO/Primary Examiner, Art Unit 3658