DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
101 Rejection
Based on applicant’s amendments, the previously set forth 101 rejection has been overcome. It is the position of the Office that calibrating the sensor based on the result of the abstract idea integrates the abstract idea into a practical application.
102 Rejection
Applicant argues Douillard et al. fails to teach that the semantic label is used for indicating a corresponding relationship between the environment and calibration of the sensor. However, insofar as how the claimed “identification result” is structurally recited, the indication of an object within the environment of the AV device reads on the claim language. Under the broadest reasonable interpretation of the claim language, applicant has not defined the environment in a way that distinguishes the Office’s position that an object near the AV 130 reads on the claimed environment, which the Oxford Dictionary defines as the surroundings or conditions in which a person, object, animal, or plant lives or operates.
The examiner would also like to point out that, insofar as how the claimed “identification result” is structurally defined, Douillard et al. teaches the device using sensors to identify an object in the environment, hence an identification result of the environment. Douillard et al. then details using the identified object in the environment for indicating a corresponding relationship, i.e. a relationship between the captured object and known data of that object, i.e. alignment. The examiner would also like to point out that there is currently no structural definition of what applicant considers to be a “corresponding relationship”. Therefore, a misalignment of the object with known information about the object, creating a fuzzy image, reads on a corresponding relationship, i.e. between the object in the environment and known data. Douillard et al. then goes on to disclose in para. [0150] that, if it is determined the corresponding relationship indicates a fuzzy image, a requirement for calibration is present and calibration is performed, i.e. a lens adjustment of the sensor so as to capture the object such that the relationship between the object and the data is aligned.
The argument that map data files are merely data describing the environment and not the environment itself, see page 13 of applicant’s filed remarks, is unclear to the examiner, as the claimed invention collects data from the environment. Therefore the data being used is data describing the environment, which was data obtained by a sensor. It is the Office’s position that Douillard et al. teaches extracting object feature data from the collected data and generating an identification result that represents whether the image data of the object aligns with known image data of the object. If there exists a misalignment, i.e. fuzzy image data, a requirement for calibration is present and calibration occurs. The examiner is unsure how the argued claim language distinguishes itself from the prior art, as applicant relies on data collected from the environment.
The examiner suggests further defining one or more of the following four elements to overcome the current art rejection: the environment, the identification result, the corresponding relationship, or the requirement.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-18 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Douillard et al. (US 2017/0124781).
With respect to claim 9, Douillard et al. teaches in Fig. 4 an electronic device (400), comprising at least one processor (2310), and a memory (as indirectly taught in [0106]) in communicative connection with the at least one processor and storing therein an instruction configured to be executed by the at least one processor, wherein the at least one processor (2310) is configured to execute the instruction to implement the following steps: obtaining data collected by a sensor (670) of a device (AV; 130) from an environment (as seen in Fig. 1, i.e. the physical environment of the AV, which reads on the claimed “environment”) where the device (AV; 130) is located; extracting feature information from the collected data (as Douillard teaches processor 2310 is configured to extract ground plane data and/or to segment portions of an image to distinguish objects from each other and from static imagery; [0106]); and generating an identification result (i.e. a semantic label) of the environment in accordance with the feature information (as classified; [0106]), the identification result (i.e. the semantic label) being used for indicating a corresponding relationship between the environment (as sensed using the various sensors 670, for example cameras and LIDAR) and calibration of the sensor (as the identification result can be aligned with map data files describing the environment so as to calibrate the sensors jointly and against each other; [0149]); wherein the corresponding relationship indicated by the identification result is whether the environment meets a requirement of the calibration of the sensor (as Douillard teaches the system continuously detecting a need for calibration based on the relationship between the data obtained by the sensors and segmented portions of the image that indicates a requirement to correct drift; [0150], insofar as how a “relationship” and a “requirement” are structurally defined); and in response to the corresponding relationship indicated by the identification result (i.e. a determined drift, i.e. a fuzzy image, between the segmented images and known data) being that the environment meets the requirement of the calibration of the sensor (i.e. if the images are fuzzy, the relationship between obtained object data within the environment and known data indicates a requirement for calibration), performing a calibration operation (lens adjustment; [0150]) on the sensor (i.e. as the calibration, insofar as how it is structurally defined, occurs by adjusting the lens of the sensor based on the corresponding relationship of obtained data and known data; [0150]).
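For illustration only, the flow mapped above may be summarized by the following non-limiting sketch; all function names, data, and threshold values are hypothetical and are not taken from Douillard et al. or from applicant’s disclosure.

```python
# Illustrative sketch only; all names and values are hypothetical and are not
# drawn from Douillard et al. or from applicant's disclosure.
import numpy as np

def extract_features(points: np.ndarray) -> np.ndarray:
    """Toy feature extraction: centroid of the collected points."""
    return points.mean(axis=0)

def identify_environment(features: np.ndarray, known: np.ndarray) -> float:
    """Toy identification result: misalignment (drift) between collected
    features and known reference data for the same scene."""
    return float(np.linalg.norm(features - known))

def needs_calibration(drift: float, threshold: float = 0.5) -> bool:
    """The 'corresponding relationship': drift above a threshold indicates
    that the environment meets the requirement for calibration."""
    return drift > threshold

# Example: collected point data vs. known reference data for the same object.
collected = np.array([[1.2, 0.1, 0.0], [1.3, 0.0, 0.1], [1.1, 0.2, 0.0]])
known_reference = np.array([0.5, 0.0, 0.0])
drift = identify_environment(extract_features(collected), known_reference)
if needs_calibration(drift):
    print(f"drift={drift:.2f}: perform calibration (e.g., adjust the sensor)")
```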
The method steps of claim 1 are performed during the operation of the rejected structure of claim 9.
With respect to claim 17, Douillard et al. teaches a non-transitory computer-readable storage medium storing therein a computer instruction [0125], wherein the computer instruction [0135] is configured to be executed by a computer [0128] to implement the device environment identification method according to rejected claim 1 during the operation of the rejected device of claim 9.
With respect to claim 10, Douillard et al. teaches in Fig. 4 the electronic device (400) wherein the at least one processor (2310) is configured to execute the instruction (3144) to further implement: filtering the collected data to obtain filtered data (using a truncated signed distance function; [0119]), wherein the extracting the feature information from the collected data comprises: extracting the feature information from the filtered data (as Fig. 31 describes taking the collected data, extracting from that data features of the environment in a 3D space, and using the TSDF to fuse sensor data and map data by limiting the range of distances considered in the environment).
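The following non-limiting sketch illustrates truncation-style filtering in general; it is a simplified hypothetical example and is not the TSDF pipeline disclosed by Douillard et al., whose TSDF is used for fusing sensor and map data.

```python
# Hypothetical sketch of truncation-style filtering: limiting the range of
# signed distances considered. Not taken from Douillard et al.
import numpy as np

def truncate_signed_distance(sdf: np.ndarray, trunc: float) -> np.ndarray:
    """Clamp signed distances to [-trunc, +trunc]; values outside the band
    carry no additional information and are saturated."""
    return np.clip(sdf, -trunc, trunc)

def filter_by_truncation(points: np.ndarray, sdf: np.ndarray, trunc: float) -> np.ndarray:
    """Keep only the points whose signed distance lies within the truncation band."""
    mask = np.abs(sdf) <= trunc
    return points[mask]

points = np.array([[0.0, 0.0, 0.2], [1.0, 2.0, 5.0], [0.3, 0.1, 0.4]])
sdf = np.array([0.05, 3.0, -0.1])   # signed distance of each point to a surface
print(truncate_signed_distance(sdf, trunc=0.5))
print(filter_by_truncation(points, sdf, trunc=0.5))
```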
The method steps of claim 2 are performed during the operation of the rejected structure of claim 10.
With respect to claim 11, Douillard et al. teaches in Fig. 4 the electronic device (400) wherein the collected data (via the various sensors that include cameras and LIDAR) comprises point cloud data (3D point cloud data based on laser returns 3608; [0141]), and the point cloud data (3608) comprises position coordinate information of each data point in a data point set (defining that 3D space and its features), wherein the filtering the collected data to obtain the filtered data (using the disclosed TSDF) comprises: filtering the data point set in accordance with the position coordinate information to obtain the filtered data (as, using the TSDF logic, the point cloud data in the 3D space, and the known heights of the LIDAR sensors on the vehicle, the taught filtering function is capable of filtering out ranges based on the 3D coordinates and the known sensor heights; [0141]).
The method steps of claim 3 are performed during the operation of the rejected structure of claim 11.
With respect to claim 12, Douillard et al. teaches in Fig. 4 the electronic device (400) wherein the filtering comprises at least one of: height filtering (using the TSDF and the known sensor heights), comprising deleting each data point in the data point set whose height coordinate value is less than or equal to a third predetermined threshold (as defined by the TSDF), the height coordinate value being comprised in the position coordinate information (part of the point cloud data collected by the sensors).
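For illustration of the claimed height filtering only, a minimal non-limiting sketch follows; the threshold and the point values are hypothetical and are not taken from the record.

```python
# Minimal illustrative height filter; hypothetical values, not from the record.
import numpy as np

def height_filter(points: np.ndarray, threshold: float) -> np.ndarray:
    """Delete each data point whose height (z) coordinate is less than or
    equal to the threshold, keeping the rest of the point set."""
    return points[points[:, 2] > threshold]

cloud = np.array([[1.0, 2.0, -0.1],   # below threshold: removed (e.g., ground return)
                  [1.5, 2.2, 0.0],    # equal to threshold: removed
                  [1.1, 2.1, 1.8]])   # above threshold: kept
print(height_filter(cloud, threshold=0.0))
```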
The method steps of claim 4 are performed during the operation of the rejected structure of claim 12.
With respect to claim 13, Douillard et al. teaches in Fig. 4 the electronic device (400) wherein the generating the identification result of the environment (via the semantic label; [0106]) in accordance with the feature information (i.e. the features from the environment) comprises: identifying a geometrical feature (i.e. an object class; [0106]) contained in the collected data (as collected by the various sensors) in accordance with the feature information (i.e. the features extracted from the data depicting the environment); and generating the identification result of the environment in accordance with the geometrical feature contained in the collected data (as the semantic labels are in accordance with the object class; [0106]), wherein in case that the geometrical feature (i.e. the object class; [0106]) contained in the collected data meets a predetermined condition (for example, the height of a sensor or the distance to an object; [0119]) for the calibration of the sensor (i.e. as set forth by a truncated signed distance function), the corresponding relationship indicated by the identification result (via the label) is that the environment (as sensed) meets a requirement of the calibration of the sensor (i.e. distance, for example), and in case that the geometrical feature (of that object) contained in the collected data does not meet the predetermined condition for the calibration of the sensor (i.e. is outside a distance range defined by the TSDF), the corresponding relationship indicated by the identification result (i.e. the label) is that the environment (as sensed) does not meet the requirement of the calibration of the sensor (as the geometrical object, as labeled, is outside the defined range and does not meet the requirement of the calibration of that sensor).
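For illustration of the mapping above only, the following non-limiting sketch shows a geometrical feature being tested against a predetermined condition (here, a hypothetical distance range) to decide whether the calibration requirement is met; the range values are not taken from Douillard et al.

```python
# Hypothetical sketch: a geometrical feature that satisfies a predetermined
# condition (a distance range) indicates that the environment meets the
# calibration requirement. Names and values are illustrative only.
def meets_calibration_requirement(feature_distance: float,
                                  min_range: float = 0.5,
                                  max_range: float = 20.0) -> bool:
    """Return True when the identified geometrical feature lies inside the
    range the calibration procedure needs; False otherwise."""
    return min_range <= feature_distance <= max_range

print(meets_calibration_requirement(5.0))    # inside the range: requirement met
print(meets_calibration_requirement(50.0))   # outside the range: requirement not met
```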
The method steps of claim 5 are performed during the operation of the rejected structure of claim 13.
With respect to claim 14, Douillard et al. teaches in Fig. 4 the electronic device (400) wherein the collected data (via the sensors) comprises a plurality of data regions (i.e. distinguishable regions 2552), each data region comprises a plurality of data points (clusters or groups of a point cloud; [0108]), and the feature information (i.e. the information used to classify an object) comprises feature information of the data points in the plurality of data regions (known point clusters that define an object, etc.), wherein the identifying the geometrical feature (of that object so as to classify the object) contained in the collected data (via the sensors) in accordance with the feature information (i.e. the information that is used to classify the object) comprises: identifying the geometrical feature contained in the collected data in accordance with the feature information of the data points in the plurality of data regions (so as to define the object from the point cloud, i.e. to classify those points that define the geometrical feature of the object, thereby allowing classification of that object), wherein with respect to each data region (i.e. point cluster), in case that a similarity feature of the feature information of the plurality of data points in the data region meets a predetermined similarity condition (i.e. as Douillard et al. teaches using point cloud data of 2D and 3D data against map data to identify, i.e. classify, features in the environment), there is the geometrical feature in the data region (i.e. the distinguishable region), and in case that the similarity feature of the feature information of the plurality of data points in the data region (i.e. the data points of the feature information) does not meet the predetermined similarity condition (i.e. no match), there is no geometrical feature in the data region (the system determines that there is no geometrical feature in the data region matching the map data of the various features).
The method steps of claim 6 are performed during the operation of the rejected structure of claim 14.
With respect to claim 15, Douillard et al. teaches in Fig. 4 the electronic device (400) wherein that the similarity feature (i.e. a matching feature) of the feature information (the determined object based on the collected data points) of the plurality of data points in the data region (i.e. the point cloud) meets the predetermined similarity condition (i.e. there is a match between the data and the mapped data) comprises: similarity of the feature information of the plurality of data points in the data region is greater than or equal to a first predetermined similarity threshold (as indirectly taught; [0069], which discloses determining a match between collected data and mapped data, and if a defined threshold is not achieved to indicate a match, then no match is determined), wherein that the similarity feature of the feature information of the plurality of data points in the data region does not meet the predetermined similarity condition (i.e. there is no match between the data and the mapped data) comprises: similarity of the feature information (i.e. a matching feature) of the plurality of data points in the data region (i.e. the point cloud) is less than or equal to a third predetermined similarity threshold (as indirectly taught; [0069], which discloses determining a match between collected data and mapped data, and if a defined threshold is not achieved to indicate a match, then no match is determined; the system of Douillard et al. is capable of determining many different objects or features within an environment, thereby indirectly teaching many different thresholds used to determine a similarity feature).
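For illustration of the threshold-based similarity determination only, a non-limiting sketch follows; the similarity measure and both thresholds are hypothetical and are not values taken from Douillard et al.

```python
# Illustrative similarity-threshold check; the similarity measure and both
# thresholds are hypothetical and are not taken from Douillard et al.
from typing import Optional
import numpy as np

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Toy similarity score (cosine similarity of two feature vectors)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def region_has_feature(score: float,
                       first_threshold: float = 0.9,
                       third_threshold: float = 0.5) -> Optional[bool]:
    """At or above the first threshold: the region contains the geometrical
    feature. At or below the third threshold: it does not. In between: undecided."""
    if score >= first_threshold:
        return True
    if score <= third_threshold:
        return False
    return None

collected = np.array([0.9, 0.1, 0.4])
mapped = np.array([1.0, 0.0, 0.5])
print(region_has_feature(similarity(collected, mapped)))
```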
The method steps of claim 7 are performed during the operation of the rejected structure of claim 15.
With respect to claims 8 and 16, Douillard et al. teaches in Fig. 4 the electronic device (400) wherein the device comprises an autonomous vehicle (AV 130).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Abaru et al. (US 2019/0204425), which teaches calibrating sensors for an autonomous vehicle using image data.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MATTHEW G MARINI whose telephone number is (571)272-2676. The examiner can normally be reached Monday-Friday 8am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Stephen Meier can be reached at 571-272-2149. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MATTHEW G MARINI/ Primary Examiner, Art Unit 2853