DETAILED ACTION
Response to Amendment
This action is in response to the remarks entered on December 16, 2025.
Claims 1, 2, 4 – 6, 9, 21 – 23, 25 – 27, 29, 31 and 33 – 38 are pending in the current application.
Claims 1, 2, 5, 6, 22, 23, 27, 31, 35 and 37 are amended.
Claim 38 is newly added.
Claims 3, 7, 8, 10 – 20, 24, 28, 30 and 32 are cancelled.
Claims 33 – 37 are objected to.
Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.
Information Disclosure Statement
The information disclosure statement (IDS) submitted is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Drawings
The drawings are objected to under 37 CFR 1.83(a). The drawings must show every feature of the invention specified in the claims. Therefore, the newly recited claim limitations regarding “a captured sampling rate,” “a data processing sampling rate,” “a different data processing sampling rate” and “a different rate” must be shown in the drawings and described in the written description, or the feature(s) canceled from the claim(s). No new matter should be entered.
Please also see 37 CFR 1.83(a): the drawing in a nonprovisional application must show every feature of the invention specified in the claims. However, conventional features disclosed in the description and claims, where their detailed illustration is not essential for a proper understanding of the invention, should be illustrated in the drawing in the form of a graphical drawing symbol or a labeled representation (e.g., a labeled rectangular box).
Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. The figure or figure number of an amended drawing should not be labeled as “amended.” If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next office action. The objection to the drawings will not be held in abeyance.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1, 2, 4 – 6, 9, 21 – 23, 25 – 27, 29, 31 and 38 are rejected under 35 U.S.C. 103 as being unpatentable over Ebrahimi Afrouzi et al (US Pat Pub No. 2020/0409376, hereinafter Afrouzi et al) in view of Roy et al (US Pat Pub No. 2019/0107846).
Regarding claims 1, 22 and 31, Afrouzi et al shows a method and system comprising a computer (see at least Para 0536, where the computer acts as a terminal over a WiFi network transmitting files/logs) and storage devices storing instructions (see at least Para 0537, where a non-volatile configuration stored in flash or NVRAM stores computer program code) operable, when executed by the computer, to cause the computer to perform operations comprising (see at least Para 0153 for a robot with a microcontroller executing computer code; see also at least Para 0209 for the robot performing outdoor environment mapping as an autonomous drone):
obtaining sensor data from a robot captured at a captured sampling rate (see at least Para 0153 for a robot equipped with sensors, with collection of environment sensor data in Para 0344; see also the lidar sensor sampling rates of 4 – 10 Hz in Para 0259),
indicating a traversed route at a property (see at least Para 0244 for a user-defined boundary zone including sub-areas or perimeter openings, also shown in Figure 51 with boundary zone 5500; see also Para 0245 for defining a path for the robot to traverse);
determining data frames of the sensor data (see Para 0398, where data from a sensor is provided with a label via a lookup table as a data frame, and Para 0179 for data with structure; see also Para 0155, where captured object/sensor data is placed in an object dictionary as a data frame for classification purposes);
selecting a subset of the sensor data for use during a localization process (see at least Para 0189 and 0190 for generating a map by selecting a subset of points 1901 as sensor data collected at different times; see also at least Para 0193 and 0248 for a subset of data used for map building);
using the sampling rates determined for sampling the sensor data (see at least the lidar sensor sampling rates of 4 – 10 Hz in Para 0259; the camera imager of Para 0196 and Para 0415, where the sampling rate is changed/selected due to obstacle density because the robot's captured camera image is distorted, so the frame rate of data collection is adjusted as selection of a sensor data subset based upon sampling rates; and the TSOP infrared sensor of Para 0555 with a sampling rate at a 38 kHz operating frequency, where each sensor has its own determined sampling rate for sampling data);
and from a plurality of sensor data subsets (see at least Para 0189 and 0190 for generating a map by selecting a subset of points 1901 as sensor data collected at different times; see also Para 0193 and 0248 for a subset of data used for map building),
each of which includes data for navigating using localization (see at least Para 0181 and 0182, where the map is generated with newly collected sensor data and provides robot navigation using the map);
the subset of the sensor data includes data frames of the sensor data obtained along the route at the property (see also Para 0398, where data from a sensor is provided with a label via a lookup table as a data frame; Para 0157, where the object/sensor data recognition process places the captured sensor data into a database to train a dictionary for classification purposes; at least Para 0189 and 0190 for generating a map by selecting a subset of points 1901 as sensor-collected data; and Para 0193 and 0248 for a subset of data used for map building);
storing the selected subset of the sensor data in memory for use as localization exemplars during a localization process by a second robot (see at least Para 0398, where data from a sensor is provided with a label via a lookup table as a data frame; Para 0157, where the recognition process places the unrecognized captured sensor data, as selected, into a database to train a dictionary for classification purposes; at least Para 0189 and 0190 for generating a map by selecting a subset of points 1901 as sensor-collected data; and Para 0193 and 0248 for a subset of data used for map building;
see also at least Para 0447, where the first robot transmits the map and trajectory to the second robot, and Para 0575, where the map is stored in the memory of the robot. Para 0190 shows the sensor of the robot collecting a subset of points 1901 at three different times 2000, 2001 and 2002 as the robot moves within the environment, where the robot observes features in each subset and gains clarity in differential particularities; in this instant case, depth data for points A and B from different positions and times, where only one subset of the data is chosen to create the localization map, Para 0193, by distinguishing particular data differences at different times/depths, serves as the subset of the sensor data for SLAM map building/exemplars for robot localization).
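For illustration only, and not part of the cited disclosures or the record, the claimed idea of selecting a subset of sensor data captured at one rate for processing at a lower rate can be sketched as simple frame decimation (all names and rates below are hypothetical):

```python
# Illustrative sketch only: keeping a subset of captured data frames so the
# effective rate matches a lower data processing sampling rate. The function
# name and rates are hypothetical, not taken from Afrouzi or Roy.

def select_frames(frames, captured_hz, processing_hz):
    """Keep roughly every (captured_hz / processing_hz)-th frame."""
    if processing_hz <= 0 or processing_hz > captured_hz:
        raise ValueError("processing rate must be in (0, captured rate]")
    stride = round(captured_hz / processing_hz)
    return frames[::stride]

# Ten frames captured in one second at 10 Hz, processed at 5 Hz:
captured = list(range(10))
subset = select_frames(captured, captured_hz=10, processing_hz=5)
print(subset)  # [0, 2, 4, 6, 8] -- every 2nd frame kept
```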
However, Afrouzi does not further discuss a varying/different sampling rate for each data processing stage for the robotic sensor data.
Roy et al further discusses a data processing sampling rate for sampling data frames of the sensor data (see at least Para 0492, where voice/video sensor data from the UAV drone is preprocessed before entering the model, with the processing rate sampled at two times the Nyquist limit/Nyquist frequency to avoid aliasing, Para 0103; see also Para 0018 for Principal Component Analysis (PCA) of the sensor data, which includes data frames, and Para 0081 and 0082 for the dataset);
a sensor data subset each for a different data processing rate (see at least Para 0060 – 0067 for a UAV drone robot formation with each robot as a subset, Para 0060, where each robot subset obtains new sensor data that is further processed/refined individually upon normalizing, per the Nyquist frequency sampling discussed above, along with further data fusion in Para 0067);
localization using the determined data processing sampling rate (see at least Para 0107 for UAV robot navigation/localization based on sensor data using the determined data processing rate, with new sensor data added at the Nyquist sampling rate discussed above).
It would have been obvious to one of ordinary skill in the art, at the time of filing, to apply the varying data sampling rates used in the dynamic environment of Roy to the multiple sensor data streams obtained in Afrouzi, since varying the data processing rate as taught by Roy would eliminate aliasing in the multi-sensor data processing fusion desired by Afrouzi yet not discussed therein, and this known frequency sampling technique of Roy would improve the similar robot sensor data manipulation for localization in Afrouzi, as desired by the robot devices of both Roy and Afrouzi.
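For illustration only, and not part of the cited references, the Nyquist criterion underlying the aliasing rationale above can be shown with simple arithmetic: a tone of frequency f sampled below twice f "folds" down to a lower apparent frequency (the values below are hypothetical examples):

```python
# Illustrative sketch only: frequency folding (aliasing) when a signal of
# frequency f (Hz) is sampled at rate fs (Hz). Sampling at fs >= 2*f
# preserves the frequency; sampling below that aliases it.

def alias_frequency(f, fs):
    """Apparent (folded) frequency of a tone f sampled at rate fs."""
    return abs(f - fs * round(f / fs))

# A 5 Hz tone sampled at 20 Hz (>= 2 * 5) is seen at its true frequency ...
print(alias_frequency(5, 20))  # 5
# ... but sampled at only 8 Hz (< 2 * 5), it folds down to 3 Hz.
print(alias_frequency(5, 8))   # 3
```

This is why a processing rate at two times the signal's highest frequency, as the Roy citation above describes, avoids aliasing.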
Regarding claims 2, 4, 23 and 25, Afrouzi et al shows detecting a first feature in the sensor data (see at least Para 0155 for image data used to perform image analysis and object identification).
Roy et al further shows the first feature in the sensor data (see at least Para 0079, which also discusses image data obtained from a camera sensor),
determining the data processing sampling rate that is different from and based on the first feature, using the detected first features (see at least Para 0492, where voice/video sensor data from the UAV drone is preprocessed before entering the model, with the processing rate sampled at two times the Nyquist limit/Nyquist frequency to avoid aliasing; see also Para 0018 for Principal Component Analysis (PCA) of the sensor data, which includes data frames, and Para 0081 and 0082 for the dataset; Para 0107 for new sensor data added as different data);
selecting part of the subset of the sensor data using the data processing sampling rate (see at least Para 0492, where voice/video sensor data from the UAV drone is selected/preprocessed before entering the model, with the processing rate sampled at two times the Nyquist limit/Nyquist frequency to avoid aliasing; see also Para 0018 for Principal Component Analysis (PCA) of the sensor data, which includes data frames, and Para 0081 and 0082 for the dataset).
It would have been obvious to one of ordinary skill in the art, at the time of filing, to apply the varying data sampling rates used in the dynamic environment of Roy to the multiple sensor data streams obtained in Afrouzi, since varying the data processing rate as taught by Roy would eliminate aliasing in the multi-sensor data processing fusion desired by Afrouzi yet not discussed therein, and this known frequency sampling technique of Roy would improve the similar robot sensor data manipulation for localization in Afrouzi, as desired by the robot devices of both Roy and Afrouzi.
Regarding claims 5 and 26, Afrouzi et al shows determining a sampling rate for a portion of the route (see at least Para 0494, where the sensor signal captured during the route is converted at a sampling rate);
selecting, using the sampling rate, at least part of the subset of the sensor data (see at least Para 0189 and 0190 for generating a map by selecting a subset of points 1901 as sensor-collected data; see also Para 0193 and 0248 for a subset of data used for map building); a subset of sensor data images obtained along the portion of the route at the sampling rate (see at least Para 0189 and 0190 for generating a map by selecting a subset of points 1901 as sensor-collected data; see also Para 0193 and 0248 for a subset of data used for map building);
However, Afrouzi et al does not further discuss the data processing sampling rate.
Roy et al further shows determining the data processing sampling rate (see at least Para 0492, where voice/video sensor data from the UAV drone is preprocessed before entering the model, with the processing rate sampled at two times the Nyquist limit/Nyquist frequency to avoid aliasing).
It would have been obvious to one of ordinary skill in the art, at the time of filing, to apply the varying data sampling rates used in the dynamic environment of Roy to the multiple sensor data streams obtained in Afrouzi, since varying the data processing rate as taught by Roy would eliminate aliasing in the multi-sensor data processing fusion desired by Afrouzi yet not discussed therein, and this known frequency sampling technique of Roy would improve the similar robot sensor data manipulation for localization in Afrouzi, as desired by the robot devices of both Roy and Afrouzi.
Regarding claims 6 and 27, Afrouzi et al shows that the sampling rates indicate a number of data frame subsets to be selected as part of the subset of sensor data over a period of time (see at least Para 0189 for an example of a spatial model generated based upon a subset of data; the lidar sensor sampling rates of 4 – 10 Hz in Para 0259; the camera imager of Para 0196 and Para 0415, where the sampling rate is changed/selected due to obstacle density because the captured image is distorted, so the frame rate of data collection is adjusted/determined as selection of a sensor data subset based upon sampling rates; also Para 0196 for the data frame of the camera; Para 0398, where data from a sensor is provided with a label via a lookup table as a data frame, and Para 0179 for data with structure; and Para 0155, where captured object/sensor data is placed in an object dictionary as a data frame for classification purposes); or over a number of data frames captured in the sensor data (see at least Para 0190 for data frames at times 2000, 2001 and 2002 as three captured data frames);
However, Afrouzi et al does not further discuss the data processing sampling rate.
Roy et al further shows determining the data processing sampling rate (see at least Para 0492, where voice/video sensor data from the UAV drone is preprocessed before entering the model, with the processing rate sampled at two times the Nyquist limit/Nyquist frequency to avoid aliasing).
It would have been obvious to one of ordinary skill in the art, at the time of filing, to apply the varying data sampling rates used in the dynamic environment of Roy to the multiple sensor data streams obtained in Afrouzi, since varying the data processing rate as taught by Roy would eliminate aliasing in the multi-sensor data processing fusion desired by Afrouzi yet not discussed therein, and this known frequency sampling technique of Roy would improve the similar robot sensor data manipulation for localization in Afrouzi, as desired by the robot devices of both Roy and Afrouzi.
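For illustration only, and not part of the record, the relation between a sampling rate and the number of data frames over a period of time (as recited in claims 6 and 27) is simple arithmetic, frames = rate × seconds. The 4 – 10 Hz figures echo the lidar rates cited above; the 2-second window is a hypothetical example:

```python
# Illustrative sketch only: how many data frames a given sampling rate
# yields over a time window. The window length is hypothetical.

def frames_in_window(rate_hz, seconds):
    """Number of whole data frames captured at rate_hz over the window."""
    return int(rate_hz * seconds)

print(frames_in_window(4, 2))   # 8 frames over 2 s at the 4 Hz low end
print(frames_in_window(10, 2))  # 20 frames over 2 s at the 10 Hz high end
```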
Regarding claims 9 and 29, Afrouzi et al shows obtaining location data from the robot indicating an approximate location of the robot (see at least Para 0311, where an approximate location is determined),
and storing the selected subset of the sensor data for use as localization exemplars comprises storing the location data from the robot (see at least Para 0311 for creating an example map, as a localization exemplar, to be stored based on the subset).
Regarding claim 21, Afrouzi et al shows obtaining an exemplar request from the robot or another robot at the property (see at least Para 0447, where the first robot transmits the map and trajectory to the second robot based upon a robot command/request originating from a user command in Para 0277; see also Para 0575, where the map is stored in the memory of the robot);
selecting one or more data frames of the subset of the sensor data using the exemplar request (see at least Para 0277 for a user command to the second robot, with the second robot's response in Para 0477 providing the appropriate map associated with a label for the unique tags in each sub-area);
and providing the selected localization exemplar to the requesting robot (see at least Para 0447, where the first robot transmits the map and trajectory to the second robot based upon the robot command/request).
Regarding claim 38, Afrouzi et al shows that the captured sampling rate comprises a different rate other than the processing rate (see at least Para 0259 for the lidar sensor at captured sampling rates of 4 – 10 Hz, and the TSOP infrared sensor of Para 0555 with a sampling rate at 38 kHz as a different rate with respect to the lidar sensor; see also Para 0494 for a processing rate of 8000 samples per second in analog-to-digital signal processing);
Roy et al shows a rate other than the data processing rate (see at least Para 0102 and 0103, where each time new sensor data is provided, Para 0067, the new data sampling rate, a rate other than the old data processing rate, is checked against the Nyquist frequency bound).
It would have been obvious to one of ordinary skill in the art, at the time of filing, to apply the varying data sampling rates used in the dynamic environment of Roy to the multiple sensor data streams obtained in Afrouzi, since varying the data processing rate as taught by Roy would eliminate aliasing in the multi-sensor data processing fusion desired by Afrouzi yet not discussed therein, and this known frequency sampling technique of Roy would improve the similar robot sensor data manipulation for localization in Afrouzi, as desired by the robot devices of both Roy and Afrouzi.
Response to Arguments
In response to applicant’s remark that Afrouzi does not show the newly recited claim limitations regarding “sensor data…a captured sampling rate…a data processing sampling rate…sampling data frame…for sensor data,” applicant’s attention is directed to Page 5 above, where the newly recited claim limitations are now addressed under Afrouzi in view of Roy et al.
In the instant case, Roy shows the UAV sensor data captured at a different data processing sampling rate, utilizing the Nyquist frequency limit, for each UAV as a subset of the UAV/drone robot formation, creating data subsets with further data fusion for navigation/localization purposes.
Claim Objections
Claims 33 – 37 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Abby Lin, can be reached at 571-270-3976. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Ian Jen/Primary Examiner, Art Unit 3664