Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Applicant's Arguments
The previous rejection is withdrawn. Applicant's amendments are entered, and Applicant's remarks are entered into the record. Applicant's amendments necessitated a new search, which uncovered a new reference, and a new ground of rejection is made herein. Applicant's arguments are therefore moot in view of the new rejection of the claims.
The newly added limitations of claim 1 are addressed by the newly cited reference to Tadi et al., as set forth in detail in the rejection of claim 1 below.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1 and 8-9 are rejected under 35 U.S.C. § 103 as being unpatentable over U.S. Patent Application Publication No. US 2022/0138568 A1 to Smolyanskiy et al. (NVIDIA), filed in 2020, in view of Chinese Patent Publication No. CN111721291B to Striking Science, filed in 2020, in view of U.S. Patent Application Publication No. US 2020/0372815 A1 to KIKUCHI et al., and in view of U.S. Patent Application Publication No. US 2024/0085211 A1 to Tadi et al., filed in 2021.
[media_image1.png: greyscale image]
In regard to claims 1 and 9, Smolyanskiy discloses “...1. An unmanned aerial device, comprising: (see FIG. 1a where inputs from the vehicle can be provided to a training engine and simulator to create a mapping of the acceleration, velocity, and position of the objects in the coordinate system for future predictions)
a plurality of optical radars, configured to generate a plurality of ranging data; (see paragraph 25 where a first vehicle can provide LIDAR, camera, and radar data for actors to the DNN, along with ground truth data)
a visual sensing module, configured to execute a simultaneous localization and mapping to (see paragraph 64 where the processor can provide vehicle HD localization on the map, including static object locations and precise instructions for moving around headings and contours) generate obstacle distance data, six-axis acceleration data, and spatial coordinate data; (see paragraph 68 where the acceleration data can be provided in the state information with the past location of the objects and also the past location data 410 of the actor with the velocity information)”
Smolyanskiy is silent, but Striking teaches “...an edge operator, coupled to the visual sensing module, and configured to perform spatial coordinate conversion on the obstacle distance data, the six-axis acceleration data, and the spatial coordinate data; and” (see claims 1-4, where the gravitational acceleration, the bulk acceleration, and the Coriolis acceleration from the INS are given a coordinate transformation from the carrier coordinate system to the launching coordinate system, and see paragraphs 1-12: the method comprises an inertial navigation algorithm arrangement under the launching g coordinate system: establishing a navigation initialization state, calculating an initial attitude matrix, and initializing local Earth parameters; measuring the projection of the gyroscope angular rate relative to inertial space in the carrier b coordinate system, and the projection of the accelerometer specific force relative to inertial space in the carrier b coordinate system; converting the angular rate and the specific force through the attitude matrix into physical quantities in the launching g coordinate system; in an angular-velocity integral loop, correcting the coordinate transformation matrix from the carrier b coordinate system to the launching g coordinate system using the measured angular velocity according to a quaternion integral method, and calculating an attitude angle from the coordinate transformation matrix; in an acceleration integral loop, using the coordinate transformation matrix to convert the observed quantity into the launching g coordinate system and compensating the gravitational acceleration g, the Coriolis acceleration a_c, and the bulk acceleration a_e in the launching g coordinate system; and obtaining speed and position navigation information through integration.)
The primary reference is silent, but KIKUCHI et al. teaches “a flight controller, coupled to the optical radars and the edge operator, and configured to”. (see paragraph 26 where the collision avoidance map includes (1) maximum speed, (2) maximum acceleration, and (3) maximum deceleration for the drone 110)
Smolyanskiy discloses “...control the unmanned aerial device to perform obstacle avoidance operation according to the ranging data, {{the obstacle distance data, the six-axis acceleration data, and the spatial coordinate data}}”. (see paragraph 68 where the acceleration data can be provided in the state information with the past location of the objects and also the past location data 410 of the actor with the velocity information)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of Smolyanskiy with the teachings of Striking Science, with a reasonable expectation of success, since Striking teaches that acceleration data can be captured for the aircraft from an INS, and that this coordinate system can be transformed and converted to a second coordinate system. A pitch angle and a roll angle can then be provided in a launching coordinate system, taking into account the Coriolis acceleration vector, gravity, and the acceleration of the vehicle as a bulk acceleration vector. An exact speed of the aircraft can then be determined from the INS acceleration for high-accuracy inertial navigation. See claims 1-2 and paragraphs 1-10.
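To make the coordinate-conversion chain concrete, the following is a minimal Python sketch of the kind of INS mechanization Striking describes: gyroscope and accelerometer outputs measured in the carrier b coordinate system are rotated into the launching g coordinate system, gravity and Coriolis terms are compensated, and speed and position follow by integration. All function names, the first-order quaternion update, and the frame conventions are illustrative assumptions, not the reference's actual algorithm.

```python
# Hypothetical sketch of the mechanization described in Striking (claims 1-4).
import numpy as np

OMEGA_E = 7.292115e-5  # Earth rotation rate (rad/s)

def quat_mul(q, r):
    """Hamilton product of two quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = q; w2, x2, y2, z2 = r
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_to_dcm(q):
    """Rotation matrix (carrier b-frame -> launching g-frame) from a unit quaternion."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def ins_step(q_bg, v_g, p_g, omega_b, f_b, g_g, omega_ie_g, dt):
    """One mechanization step: attitude by quaternion integration, then the
    specific force is rotated b->g and compensated before integrating."""
    # Angular-velocity integral loop: first-order quaternion update from gyro output.
    q_bg = q_bg + 0.5 * quat_mul(q_bg, np.array([0.0, *omega_b])) * dt
    q_bg /= np.linalg.norm(q_bg)
    C_bg = quat_to_dcm(q_bg)
    # Acceleration integral loop: rotate the specific force, compensate gravity
    # (g_g) and the Coriolis term (2 * omega_ie x v) in the launching frame.
    a_g = C_bg @ f_b + g_g - 2.0 * np.cross(omega_ie_g, v_g)
    v_g = v_g + a_g * dt   # speed by integration
    p_g = p_g + v_g * dt   # position by integration
    return q_bg, v_g, p_g
```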
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of Smolyanskiy with the teachings of KIKUCHI et al., with a reasonable expectation of success, since KIKUCHI teaches that a flight controller can create a virtual UAV model to provide a predicted trajectory that can be unsafe, and then control the UAV based on the model to prevent any collision and provide an evasive action for the UAV, for safe operation at the UAV's current and future positions. See FIG. 5, blocks 500-590.
Claim 1 is amended to recite, and Tadi teaches, the amendments of “... an edge operator, coupled to the visual sensing module, and configured to perform spatial coordinate conversion on the obstacle distance data, (see paragraphs 81-82 where the camera provides the location and geometry of the validated landmark from a SLAM analyzer to the AR application) the six-axis acceleration data, and the spatial coordinate data to generate converted data; (see paragraphs 51 and 265-269 where the camera is provided, SLAM-based coordinates provide a current location of the device and the device position and orientation in a SLAM coordinate system, the IMU and the linear acceleration of the device are then taken, and the IMU and SLAM coordinate systems are converted and aligned)
.....configured to control the unmanned aerial device to perform obstacle avoidance operation according to the ranging data and the converted data”. (see paragraphs 253-263 where the visual SLAM may not be accurate, and the IMU data and acceleration data applied together to track motion can provide the result using acceleration data that has both the SLAM data and the IMU data as a converted single value)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of Smolyanskiy with the teachings of Tadi et al., with a reasonable expectation of success, since Tadi teaches that collision avoidance can be provided using a SLAM device. The SLAM device can include a camera and an IMU. A SLAM coordinate system and a coordinate system for the IMU accelerometer can be taken and aligned, and the IMU data can be added to the SLAM process. See paragraph 97. An acceleration of the device is shown in paragraph 256. This is a single result taking into account the values of the IMU device and the camera sensor to provide a single value. In this way, when the visual SLAM is ceasing to be accurate, the dead-reckoning values can take over and more accurately estimate a position of the device for collision avoidance. See paragraphs 251-261 and 81.
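The following is a hedged Python sketch of the fusion idea attributed to Tadi above: a SLAM position and an IMU dead-reckoned position are expressed in one aligned coordinate system, and the dead-reckoning estimate takes over when the visual SLAM confidence degrades. The alignment transform, confidence threshold, and blending rule are assumptions for illustration only, not Tadi's actual implementation.

```python
# Illustrative IMU/SLAM fusion in the spirit of paragraphs 251-269 (assumed API).
import numpy as np

def fuse_position(p_slam, slam_conf, p_imu_dr, R_align, t_align, conf_min=0.5):
    """Return a single position estimate in the SLAM frame.

    p_slam    : position reported by visual SLAM (SLAM frame)
    slam_conf : scalar confidence in the SLAM solution, 0..1 (assumed metric)
    p_imu_dr  : dead-reckoned position from integrated IMU acceleration (IMU frame)
    R_align, t_align : rotation/translation aligning the IMU frame to the SLAM frame
    """
    # Convert the IMU-frame estimate into the SLAM coordinate system.
    p_imu_in_slam = R_align @ p_imu_dr + t_align
    if slam_conf < conf_min:
        # Visual SLAM ceasing to be accurate: rely on dead reckoning.
        return p_imu_in_slam
    # Otherwise blend the two sources into a single converted value.
    w = slam_conf
    return w * p_slam + (1.0 - w) * p_imu_in_slam
```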
Claims 2 and 10 are rejected under 35 U.S.C. § 103 as being unpatentable over U.S. Patent Application Publication No. US 2022/0138568 A1 to Smolyanskiy et al. (NVIDIA), filed in 2020, in view of Chinese Patent Publication No. CN111721291B to Striking Science, filed in 2020, in view of U.S. Patent Application Publication No. US 2020/0372815 A1 to KIKUCHI et al., in view of U.S. Patent Application Publication No. US 2022/0350347 A1 to Hagerott, and in view of Tadi.
In regard to claims 2 and 10, the primary reference is silent, but KIKUCHI et al. teaches “2. The unmanned aerial device according to claim 1, further comprising: a DC motor driver, coupled to the flight controller,” (see paragraph 25)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of Smolyanskiy with the teachings of KIKUCHI et al., with a reasonable expectation of success, since KIKUCHI teaches that a flight controller can create a virtual UAV model to provide a predicted trajectory that can be unsafe, and then control the UAV based on the model to prevent any collision and provide an evasive action for the UAV, for safe operation at the UAV's current and future positions. See FIG. 5, blocks 500-590.
The primary reference discloses obstacle data but is otherwise silent; Hagerott teaches “...wherein the flight controller generates position data and posture data according to the six-axis acceleration data and the spatial coordinate data,... wherein the flight controller generates posture control data according to the posture data and the position prediction data, and the flight controller generates position control data according to the posture control data and the obstacle distance data, wherein the flight controller drives the DC motor driver according to the posture control data and the position control data”. (see paragraphs 37 and 58; see also paragraphs 46-61 and 62-80, particularly paragraphs 59-61)
[media_image2.png: greyscale image]
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of Smolyanskiy with the teachings of HAGEROTT et al., with a reasonable expectation of success, since HAGEROTT et al. teaches that a flight controller can create loops that provide a proper attitude and rate command, and a proper acceleration, velocity, and attitude, in blocks 712-715. This can provide a first loop of velocity, acceleration, and attitude commands, and the aircraft can be checked in terms of posture to ensure proper, safe operation. See FIG. 7.
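As a rough illustration of the cascaded loops described above, the Python sketch below chains a position loop, a velocity loop, and an attitude (posture) loop, so that an outer position command is successively converted into an inner body-rate command forwarded to the motor driver. The proportional structure, gains, and small-angle tilt mapping are assumptions, not Hagerott's actual control law.

```python
# Hypothetical cascaded-loop controller in the spirit of blocks 712-715.
import numpy as np

class CascadedController:
    def __init__(self, kp_pos=1.0, kp_vel=0.8, kp_att=4.0):
        # Illustrative proportional gains for the three nested loops.
        self.kp_pos, self.kp_vel, self.kp_att = kp_pos, kp_vel, kp_att

    def step(self, pos_ref, pos, vel, att):
        vel_cmd = self.kp_pos * (pos_ref - pos)       # outer loop: position -> velocity command
        acc_cmd = self.kp_vel * (vel_cmd - vel)       # middle loop: velocity -> acceleration command
        # Small-angle mapping from commanded lateral acceleration to a tilt
        # (posture) command, limited to +/- 0.5 rad.
        att_cmd = np.clip(acc_cmd / 9.81, -0.5, 0.5)
        rate_cmd = self.kp_att * (att_cmd - att)      # inner loop: attitude -> body-rate command
        return rate_cmd                               # forwarded to the DC motor driver
```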
[media_image3.png: greyscale image]
The primary reference to Smolyanskiy discloses “...and the flight controller executes a nonlinear observer according to the position data to perform trajectory prediction and generate position prediction data”. (see FIG. 1a where inputs from the vehicle can be provided to a training engine and simulator to create a mapping of the acceleration, velocity, and position of the objects in the coordinate system for future predictions, and FIG. 1b where the planning manager can use the acceleration data of the objects to provide a prediction)
Claims 3 and 11 are rejected under 35 U.S.C. § 103 as being unpatentable over U.S. Patent Application Publication No. US 2022/0138568 A1 to Smolyanskiy et al. (NVIDIA), filed in 2020, in view of Chinese Patent Publication No. CN111721291B to Striking Science, filed in 2020, in view of U.S. Patent Application Publication No. US 2020/0372815 A1 to KIKUCHI et al., in view of U.S. Patent Application Publication No. US 2022/0350347 A1 to Hagerott, in view of European Patent Publication No. EP1990138B1 to Fabien, filed in 2008, and in view of Tadi.
In regard to claims 3 and 11, Fabien teaches “...3. The unmanned aerial device according to claim 2, wherein the nonlinear observer is a slide mode observer”. (see claims 1-7)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of Smolyanskiy with the teachings of FABIEN et al., with a reasonable expectation of success, since FABIEN et al. teaches that an optimization can be provided to a nonlinear system using a sliding-horizon observer optimization. See claims 1-5.
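For context, a sliding mode observer of the kind recited in claim 3 can be sketched as follows: velocity is reconstructed from a measured position by a switching correction term that drives the estimation error toward a sliding surface. This is a generic textbook form on a double-integrator model under assumed gains, not the observer of Fabien or of the application.

```python
# Generic first-order sliding mode observer sketch (assumed gains lam, k).
import numpy as np

def smo_step(x_hat, v_hat, x_meas, dt, lam=5.0, k=2.0):
    """One discrete step of a sliding mode observer on a double integrator."""
    e = x_meas - x_hat                          # position estimation error
    x_hat = x_hat + (v_hat + lam * np.sign(e)) * dt
    v_hat = v_hat + k * np.sign(e) * dt         # switching term drives e to zero
    return x_hat, v_hat
```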
Claims 4-6 and 12-15 are rejected under 35 U.S.C. § 103 as being unpatentable over U.S. Patent Application Publication No. US 2022/0138568 A1 to Smolyanskiy et al. (NVIDIA), filed in 2020, in view of Chinese Patent Publication No. CN111721291B to Striking Science, filed in 2020, in view of U.S. Patent Application Publication No. US 2020/0372815 A1 to KIKUCHI et al., and in view of Tadi.
[media_image4.png: greyscale image]
In regard to claims 4 and 12, Smolyanskiy discloses “...4. The unmanned aerial device according to claim 1, wherein the optical radars comprise a first optical radar and a second optical radar, the first optical radar is configured to sense in a first horizontal direction, the second optical radar is configured to sense in a second horizontal direction, and the first horizontal direction is opposite to the second horizontal direction”. (see paragraphs 165 and 191-193, the infrared camera and LIDAR sensors 1172, 1168, and 1174, and the lidar that can capture the acceleration of the objects via radar data in 360 degrees)
In regard to claims 5 and 13, Smolyanskiy discloses “...5. The unmanned aerial device according to claim 4, wherein the optical radars comprise a third optical radar and a fourth optical radar, the third optical radar is configured to sense in a first vertical direction, the fourth optical radar is configured to sense in a second vertical direction, and the first vertical direction is opposite to the second vertical direction”. (see paragraph 23 where the device can be an aircraft, and see paragraphs 41, 165, and 191-193, the infrared camera and LIDAR sensors 1172, 1168, and 1174, and the lidar that can capture the acceleration of the objects via radar data in 360 degrees)
In regard to claims 6 and 14, Smolyanskiy discloses “...6. The unmanned aerial device according to claim 1, wherein the unmanned aerial device and another unmanned aerial device perform coordinated transportation operation, and the obstacle distance data is distance data between the unmanned aerial device and the another unmanned aerial device, wherein the flight controller controls the unmanned aerial device according to the obstacle distance data, so as to maintain a preset distance between the unmanned aerial device and the another unmanned aerial device”. (see paragraphs 41-48)
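As a simple illustration of the preset-distance limitation, the sketch below computes a velocity command that moves one vehicle along the line joining the two vehicles until the measured separation matches the preset distance. The gain and variable names are hypothetical, not from Smolyanskiy.

```python
# Hypothetical distance-keeping command for coordinated transport.
import numpy as np

def hold_separation(p_self, p_other, d_preset, kp=0.5):
    """Velocity command keeping |p_self - p_other| near d_preset."""
    offset = p_self - p_other
    d = np.linalg.norm(offset)           # measured obstacle distance data
    if d < 1e-6:
        return np.zeros(3)               # degenerate: co-located, no direction defined
    direction = offset / d
    # Move outward if too close, inward if too far.
    return kp * (d - d_preset) * direction
```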
Claims 7 and 15 are rejected under 35 U.S.C. § 103 as being unpatentable over U.S. Patent Application Publication No. US 2022/0138568 A1 to Smolyanskiy et al. (NVIDIA), filed in 2020, in view of Chinese Patent Publication No. CN111721291B to Striking Science, filed in 2020, in view of U.S. Patent Application Publication No. US 2020/0372815 A1 to KIKUCHI et al., in view of U.S. Patent No. US 10,707,961 B2 to Turner et al., filed in 2017, and in view of Tadi.
In regard to claims 7 and 15, Turner teaches “...7. The unmanned aerial device according to claim 6, wherein the unmanned aerial device and the another unmanned aerial device respectively maintain a same flight height according to a same preset flight height setting”. (see col. 7, lines 1-51, where the spacecraft are maintained on a same orbital plane)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the present disclosure to combine the disclosure of Smolyanskiy with the teachings of TURNER et al., with a reasonable expectation of success, since TURNER et al. teaches that two spacecraft can be maintained on the same or different orbital planes.
In regard to claim 8, Smolyanskiy et al. discloses “...8. The unmanned aerial device according to claim 1, wherein the visual sensing module comprises an image sensor and a plurality of infrared sensors”. (see paragraphs 61 and 62 where the vehicle, which can be an aircraft, has an infrared camera, a camera, lidar, and radar)
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JEAN PAUL CASS whose telephone number is (571)270-1934. The examiner can normally be reached Monday to Friday 7 am to 7 pm; Saturday 10 am to 12 noon.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Scott A. Browne can be reached at 571-270-0151. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JEAN PAUL CASS/Primary Examiner, Art Unit 3666