DETAILED ACTION
Response to Amendment
This Office action regarding application number 18/659,171, filed May 9, 2024, is in response to the applicant's arguments and amendments filed December 16, 2025. Claim 9 has been cancelled. New claims 12-13 have been added. Claims 1-8 and 10-11 have been amended. Claims 1-8 and 10-13 are currently pending and are addressed below.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
The applicant's arguments and amendments to the application have overcome some of the objections and rejections previously set forth in the Non-Final Office action mailed September 16, 2025. Claim 9 has been cancelled; therefore, all associated objections and rejections are withdrawn. Applicant's amendments to claim 10 have rendered the previous interpretation under 35 USC 112(f) moot through the removal of the interpreted language; therefore, the claim interpretation is withdrawn. Applicant's amendments to claims 1 and 10-11 have been deemed sufficient to overcome the previous 35 USC 101 rejections through the inclusion of “controlling a movement or operation of the machine based on”; therefore, the rejections are withdrawn. Applicant's amendments to claims 1 and 10-11 have been deemed sufficient to overcome the previous 35 USC 102 rejections through the inclusion of at least “estimating, for each of a plurality of potential error sources, how the respective error source contributed to the obtainment of the difference value … determining, based on both (i) the at least one difference value and (ii) the plurality of estimated contributions of the potential error sources”; therefore, the rejections are withdrawn. However, because these amendments change the scope of the claims, new art rejections have been made based on the changes in scope. Additionally, the applicant's arguments have been fully considered but are not fully persuasive for the reasons set forth below.
On pages 8-9, the applicant argues, “The claims differ from Tomioka in several fundamental respects. First, the claims require that the "spatially resolved map" comprises predicted values that characterize raw sensor data expected to be returned by the sensor at respective locations. Tomioka's map does not contain predicted raw sensor data. Rather, it stores higher-level features such as 3D point groups, color information, or occupancy values that represent pre-extracted environmental attributes. These are not "predicted values" of sensor data; they are stored representations of the environment itself. In contrast, the claimed invention operates in the sensor domain by predicting what sensor readings (for example, raw range intensities, reflectance values, or pixel patterns) are expected if the environment matches the map. Thus, Tomioka's system compares environmental features; the claims, by contrast, recite a comparison of the sensor signals, enabling a continuous sensor-domain consistency analysis rather than a discrete feature-matching operation.” The examiner respectfully disagrees.
MPEP 2142-2144 discusses the requirements for a case of obviousness using 35 USC 103 and provides examples of such cases. MPEP 2111 discusses Broadest Reasonable Interpretation and the interpretation of claims.
As discussed in the rejections below, Tomioka teaches reading sensor data (See Figure 2, item S12 "Receive input of sensor information"); and map data (Paragraph [0027], "The map information according to the present exemplary embodiment includes key frame group information including one or a plurality of pieces of key frame information indicating a specific object in a real space. Using the map information, the position and orientation estimation unit 13 can compare features of feature points obtained from the sensor information with features of feature points measured in advance, and measure the position and orientation of the sensor 10 in a map coordinate system," here the map data contains spatial information regarding objects in the environment and their expected locations); and comparing the two pieces of information. Tomioka further teaches that the map data includes predicted values that characterize raw sensor data expected to be returned by the sensor (Paragraph [0035], “For example, the map (map information) is generated by combining pieces of sensor information obtained by the sensor 10 mounted on the vehicle while operating the vehicle by a remote controller or manually”); here, the map data is generated by combining pieces of sensor data and therefore comprises data that is expected to be returned by the sensor.
Therefore, the combination of Tomioka and Chandler teaches that the map data comprises predicted values that characterize raw sensor data expected to be returned by the sensor.
Applicant's second and third arguments on page 9, specifically those directed to the newly amended language “plurality of potential error sources … based on both the difference value and the estimated contributions of the potential error sources”, have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
On page 9, the applicant argues, “Fourth, the claims explicitly recite controlling the machine's movement or operation based on the determination and probability. Tomioka never uses its comparison result to control vehicle movement. Tomioka's system is instead directed to gradually updating a map over time as long as detected differences are significant enough, in order to maintain map accuracy.” The examiner respectfully disagrees.
MPEP 2142-2144 discusses the requirements for a case of obviousness using 35 USC 103 and provides examples of such cases. MPEP 2111 discusses Broadest Reasonable Interpretation and the interpretation of claims.
As discussed in the rejections below, Tomioka teaches controlling a movement or operation of the machine based on (a) whether the predefined environmental characterization is determined to be fulfilled and (b) the determined probability of such fulfillment (Paragraph [0064], “Specifically, the values of only the IDs for identifying the key frames, the updated positions and orientations, and the updated 3D positions of the feature points may be stored as the update information. If the occupancy grid map described in the foregoing modification is used as the map information, the coordinates of the grid cells and the rewrite values may be stored as the update information”); here, the system is outputting an updated occupancy grid, indicating a property, occupied or unoccupied, with an associated probability from 0 to 1 (Paragraph [0088], "As employed herein, the value of each grid cell refers to the probability value of 0 to 1 representing whether there is an object in the occupancy grid map"); here, the system is then controlling the vehicle based on the determined updated 3D positions of objects and feature points and the grid cell occupancy map with an associated probability (Paragraph [0044], “In other words, the vehicle is moved based on the map information indicating the position and orientation of the vehicle at the target point (or route), and the position and orientation of the sensor 10 estimated by the position and orientation estimation unit 13. To calculate the control values, the control value determination unit 17 calculates all possible variations of the control values that reduce the Euclidean distance between the position and orientation of the vehicle and the position and orientation at the target point in the map information as control value candidates”).
Therefore, the combination of Tomioka and Chandler teaches controlling the machine's movement or operation based on the determination and probability; in this case, the determination is based on the probability of occupancy, and the system is therefore controlling the vehicle based on both.
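To illustrate the examiner's reading above, the following sketch (not part of the record; all names and the 0.5 cutoff are hypothetical) shows how a grid cell probability of 0 to 1, as in Tomioka's occupancy grid, can gate a control value on both (a) the occupancy determination and (b) the probability behind that determination:

```python
# Illustrative sketch only; all names and the 0.5 cutoff are hypothetical,
# not taken from Tomioka or the instant application.

OCCUPIED_THRESHOLD = 0.5  # hypothetical cutoff for the "occupied" determination

def plan_speed(cell_probability: float, candidate_speed: float) -> float:
    """Scale a candidate speed command using (a) whether the grid cell is
    determined occupied and (b) the probability of that determination."""
    occupied = cell_probability >= OCCUPIED_THRESHOLD  # (a) the determination
    if occupied:
        # (b) the more probable the obstacle, the stronger the slowdown
        return candidate_speed * (1.0 - cell_probability)
    return candidate_speed
```

Under this sketch, a cell with occupancy probability 0.75 and a candidate speed of 2.0 yields a command of 0.5, while a cell below the cutoff passes the candidate speed through unchanged, so the control output depends on both the determination and its probability.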
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-8 and 10-13 are rejected under 35 U.S.C. 103 as being unpatentable over Tomioka (US-20210190535) in view of Chandler (US-20240338916).
Regarding claim 1, Tomioka teaches a computer-implemented method for a machine equipped with at least one sensor, the method comprising the following steps (Paragraph [0003], "an information processing apparatus includes an obtaining unit configured to obtain a second position of a map feature point included in map information based on the map information and information obtained from a sensor mounted on a vehicle")
reading actual sensor data recorded by the at least one sensor (See Figure 2, item S12 "Receive input of sensor information")
ascertaining a location and an orientation associated with the actual sensor data (See Figure 2, item S13 "Estimate a position and orientation")
reading a spatially resolved map comprising, for respective locations, predicted values that characterize raw sensor data expected to be returned by the at least one sensor at the respective location (Paragraph [0027], "The map information according to the present exemplary embodiment includes key frame group information including one or a plurality of pieces of key frame information indicating a specific object in a real space. Using the map information, the position and orientation estimation unit 13 can compare features of feature points obtained from the sensor information with features of feature points measured in advance, and measure the position and orientation of the sensor 10 in a map coordinate system," here the map data contains spatial information regarding objects in the environment and their expected locations)
comparing, for each of a plurality of the spatial locations, the predicted values from the map with corresponding actual values of the actual sensor data (Paragraph [0030], "The update information generation unit 15 generates update information for updating the map (map information) based on the positions of the feature points included in the map information stored in the map storage unit 14 and new positions of the feature points obtained from the sensor information. Here, the update information refers to information indicating differences between the positions of the feature points in the map information and the positions of the feature points obtained from the sensor information when the feature points included in the map information are observed by the sensor 10," here the system is determining update information based on comparing expected feature points in the map information and feature points in the sensor data)
to obtain at least one difference value representing deviation between expected and actual sensor data (Paragraph [0030], "The update information generation unit 15 generates update information for updating the map (map information) based on the positions of the feature points included in the map information stored in the map storage unit 14 and new positions of the feature points obtained from the sensor information. Here, the update information refers to information indicating differences between the positions of the feature points in the map information and the positions of the feature points obtained from the sensor information when the feature points included in the map information are observed by the sensor 10," here the system is determining update information based on comparing expected feature points in the map information and feature points in the sensor data, this update information is referring to information indicating differences between positions)
how the respective error source contributed to the obtainment of the difference value (Paragraph [0021], “where the feature points used for position and orientation estimation increase or decrease or the positions of the feature points change include: a case where position estimation errors of the feature points exceed a predetermined threshold or such errors accumulate … an impact of the errors varies greatly depending on measurement accuracy of the sensor and performance of the apparatus that performs the estimation processing,” here the system is determining the influence of errors on the position estimation)
determining (i) whether the at least one difference value is indicative of fulfillment of a predefined environmental characterization of an environment surrounding the at least one sensor and (Paragraph [0064], "Specifically, the values of only the IDs for identifying the key frames, the updated positions and orientations, and the updated 3D positions of the feature points may be stored as the update information. If the occupancy grid map described in the foregoing modification is used as the map information, the coordinates of the grid cells and the rewrite values may be stored as the update information") (Paragraph [0088], "As employed herein, the value of each grid cell refers to the probability value of 0 to 1 representing whether there is an object in the occupancy grid map," here the system is determining an occupancy grid indicating whether the property, in this case whether the location is occupied, is fulfilled/occupied or not fulfilled/unoccupied)
(ii) a probability of such fulfillment of the predefined environmental characterization (Paragraph [0088], “As employed herein, the value of each grid cell refers to the probability value of 0 to 1 representing whether there is an object in the occupancy grid map. In other words, the amount of change in the probability value of a specific grid cell before and after a map update may be used as the difference,” here each grid cell is assigned a probability of occupancy/environmental characterization)
and controlling a movement or operation of the machine based on (a) whether the predefined environmental characterization is determined to be fulfilled and (b) the determined probability of such fulfillment (Paragraph [0064], “Specifically, the values of only the IDs for identifying the key frames, the updated positions and orientations, and the updated 3D positions of the feature points may be stored as the update information. If the occupancy grid map described in the foregoing modification is used as the map information, the coordinates of the grid cells and the rewrite values may be stored as the update information,” here the system is outputting an updated occupancy grid) (Paragraph [0088], "As employed herein, the value of each grid cell refers to the probability value of 0 to 1 representing whether there is an object in the occupancy grid map," here the system is outputting an occupancy grid, indicating a property, occupied or unoccupied, with an associated probability from 0 to 1) (Paragraph [0044], “In other words, the vehicle is moved based on the map information indicating the position and orientation of the vehicle at the target point (or route), and the position and orientation of the sensor 10 estimated by the position and orientation estimation unit 13. To calculate the control values, the control value determination unit 17 calculates all possible variations of the control values that reduce the Euclidean distance between the position and orientation of the vehicle and the position and orientation at the target point in the map information as control value candidates,” here the system is then controlling the vehicle based on the determined updated 3D positions of objects and feature points and the grid cell occupancy map with an associated probability).
However, Tomioka does not explicitly teach estimating, for each of a plurality of potential error sources, how the respective error source contributed to the obtainment of the difference value; and determining, based on both (i) the at least one difference value and (ii) the plurality of estimated contributions of the potential error sources.
Chandler teaches locating and modeling a 3D object captured in multiple time-series of sensor data of multiple sensor modalities, including
estimating, for each of a plurality of potential error sources, how the respective error source contributed to the obtainment of the difference value (Paragraph [0074], “determining perception errors by comparing the vehicle detections against the pseudo ground truth,” here the system includes a determination of a difference value for sensor data versus the ground truth data) (Paragraph [0077], “where the pose of the object is changing in time, and thus a pose vector p.sub.i is determined for each timestep i of the time series corresponding to a captured frame for at least one sensor modality. The values of the shape, size and pose parameters may be adjusted so as to minimise a total error function 500 comprising multiple terms based on the available sensor data as well as shape and motion models,” here the system is determining the contribution of error from a plurality of potential sources, the error function including a plurality of terms/sources; this error function is used in the processing of the sensor data, such as the comparison which determines a difference value)
determining, based on both (i) the at least one difference value and (ii) the plurality of estimated contributions of the potential error sources (Paragraph [0094], “A shape regularisation term may be used to enforce consistency of the shape model with some prior knowledge of what the shape of the object should be. For example, in the semantic keypoint refinement mentioned above, the locations of the 3D semantic keypoints within the bounding box defining the object … a shape regularisation term 940 may be based on the probability of the modelled object keypoints under the respective probability distributions, where a less probable position would be penalised more heavily than a position close to the centre of the Gaussian. In general, a shape regularisation term 940 may be used to enforce consistency,” here the system determines a probability of a position and shape of an object based on the total error terms) (Paragraph [0065], “In use, the (instance of the) perception component 102 of the autonomous vehicle 200 interprets structure within perception inputs captured by the at least one sensor 202, in real time, in accordance with its training, and the autonomous vehicle controller 204 controls the speed and direction of the vehicle based on the results, with no or limited input from any human driver,” here, using the determined total error, the system determines the positions and existence of objects/environmental characterization, and, using the probability of position and shape, the vehicle is controlled).
Tomioka and Chandler are analogous art as they are both generally related to systems and methods for processing and evaluating vehicle sensor data.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to include the estimating, for each of a plurality of potential error sources, how the respective error source contributed to the obtainment of the difference value, and the determining, based on both (i) the at least one difference value and (ii) the plurality of estimated contributions of the potential error sources, of Chandler in the evaluation of spatially resolved sensor data of Tomioka, with a reasonable expectation of success, in order to increase the perception accuracy of the system by aggregating a total error across a plurality of sensor modalities (Paragraph [0029], “to tune both the shape of the 3D object model and the time sequence of poses in a way that minimizes some overall measure of error defined in the cost function. A high level of perception accuracy is achieved by aggregating the overall error across both time and multiple sensor modalities, in a way that incorporated additional knowledge of the object class and the shape characteristics normally associated with the known object class”).
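To illustrate the rationale above, the following sketch (not part of the record; all term names and weights are hypothetical) shows the general manner of Chandler's multi-term cost function: per-source error terms are aggregated into one total error while each source's estimated contribution to the overall difference remains individually available:

```python
# Illustrative sketch only; term names and weights are hypothetical and do
# not appear in Chandler. Each entry in `terms` stands for one potential
# error source (e.g., a per-modality residual or a shape/motion prior).

def total_error(terms, weights):
    """Return (overall_error, per_source_contributions) for weighted terms."""
    contributions = {name: weights.get(name, 1.0) * value
                     for name, value in terms.items()}
    return sum(contributions.values()), contributions

terms = {"lidar_residual": 0.4, "camera_residual": 0.25, "shape_prior": 0.1}
weights = {"lidar_residual": 1.0, "camera_residual": 2.0, "shape_prior": 0.5}
overall, per_source = total_error(terms, weights)
```

Under this toy weighting, the camera term contributes 0.5 of an overall error of 0.95; Chandler's actual cost function is minimized over shape, size, and pose parameters rather than scalar placeholders, but the aggregation across a plurality of error sources is the feature relied upon in the combination.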
Regarding claim 2, the combination of Tomioka and Chandler teaches the method as discussed above in claim 1, Tomioka further teaches wherein the potential error sources include a measurement inaccuracy of the at least one sensor (Paragraph [0021], “In the former case, an impact of the errors varies greatly depending on measurement accuracy of the sensor and performance of the apparatus that performs the estimation processing”).
Regarding claim 3, the combination of Tomioka and Chandler teaches the method as discussed above in claim 1, Tomioka further teaches wherein the potential error sources include a location and orientation inaccuracy associated with the actual sensor data (Paragraph [0047], “a difference indicating the magnitude of change in the position and orientation estimated by the position and orientation estimation unit 13 before and after a map update by using a change between the position and orientation of the sensor 10 in the original map information and the position and orientation of the sensor 10 in the update information, and the update information,” here the system is determining a difference/inaccuracy of the position and orientation of the sensor data after the update is performed).
Regarding claim 4, the combination of Tomioka and Chandler teaches the method as discussed above in claim 1, Tomioka further teaches wherein the potential error sources include a map data inaccuracy (Paragraph [0021], “where the feature points used for position and orientation estimation increase or decrease or the positions of the feature points change include: a case where position estimation errors of the feature points exceed a predetermined threshold or such errors accumulate, and where the environment changes. ... In the latter case, examples of the environmental change include a change in the layout of the traveling environment, variations due to human or object movement, and a change in lighting conditions. Occurrence of such environmental changes can increase or decrease feature points in the map prepared in advance or move the initially observed feature points,” here error sources can include environment/map changes such as inaccuracies in the layout or moved objects).
Regarding claim 5, the combination of Tomioka and Chandler teaches the method as discussed above in claim 1, Tomioka further teaches wherein the determination of whether the predefined environmental characterization is fulfilled and the probability takes place based on a comparison of the at least one difference value or a residual value that represents the at least one difference value, together with the estimated contributions of the potential error sources, to a threshold value (Paragraph [0088], “As employed herein, the value of each grid cell refers to the probability value of 0 to 1 representing whether there is an object in the occupancy grid map. In other words, the amount of change in the probability value of a specific grid cell before and after a map update may be used as the difference.”) (Paragraph [0048], “If the value indicating the easiness of update is less than a threshold, the map is not updated. In such a case, the update behavior determination unit 130 inputs information indicating an absence of update to the map update unit 16,” here the determination of the environmental characterization is based on the comparison to determine a change, which is based on a probability value; this probability value is on a threshold range from 0-1; while Tomioka does not explicitly teach the use of a plurality of error sources, this limitation is taught by Chandler).
Regarding claim 6, the combination of Tomioka and Chandler teaches the method as discussed above in claim 1, Tomioka further teaches wherein error limits are taken into account when estimating how the error source contributed to obtaining the difference value (Paragraph [0021], “where the feature points used for position and orientation estimation increase or decrease or the positions of the feature points change include: a case where position estimation errors of the feature points exceed a predetermined threshold or such errors accumulate,” here the position errors are compared to a threshold/limit) (While Tomioka does not explicitly teach the use of a plurality of error sources, this limitation is taught by Chandler).
Regarding claim 7, the combination of Tomioka and Chandler teaches the method as discussed above in claim 1, Tomioka further teaches wherein the error limits are checked to determine whether an adjustment of the error limits is required (Paragraph [0057], “In particular, the greater the update behavior of the map, the more gently the camera position can be changed by increasing the number of corrections. The weights on the positions and orientations of the key frames included in the key frame group are increased each time the positions and orientations are measured. The greater the update behavior, the smaller the value of the rate of increase. Thus, updating the map more slowly as an effect on the position and orientation estimation is higher reduces the amounts of change in the position measurement values due to an update, and reduces abrupt changes in the speed and direction of the vehicle,” here the system will adjust the amount of change per update as a function of the amount of update; the system will adjust the error limit/change per correction as a function of the overall size of the required update).
Regarding claim 8, the combination of Tomioka and Chandler teaches the method as discussed above in claim 1, Tomioka further teaches wherein an uncertainty volume is considered for the actual sensor data, and wherein the predefined environmental characterization linked to the actual sensor data is assumed for the entire uncertainty volume (Paragraph [0055], “Specifically, the wider the update region, the greater the calculated difference. More specifically, the difference obtaining unit 120 calculates the volume of the foregoing convex hull space of the key frame group to be updated, and calculates the reciprocal of the volume value as the difference value. Alternatively, the difference value may be the reciprocal of the surface area of the convex hull space. The reciprocal of the volume of an ellipse circumscribing or inscribing the convex hull space or the reciprocal of the length of the major axis of the ellipse may be used. While the convex hull space is calculated to include the key frame group, the convex hull space may be calculated to include the feature points included in the update information,” here the system is using the calculated difference obtained by comparing the sensor data to the map data in order to calculate a volume of a space/uncertainty volume of the object to be updated).
Regarding claim 10, Tomioka teaches a computing unit of a machine that is equipped with at least one sensor, the computing unit comprising a processor system that includes at least one processor, wherein the processor system is configured to (Paragraph [0003], "an information processing apparatus includes an obtaining unit configured to obtain a second position of a map feature point included in map information based on the map information and information obtained from a sensor mounted on a vehicle") (See Figure 8, showing an information processing apparatus/computing unit)
read actual sensor data recorded by the at least one sensor (See Figure 2, item S12 "Receive input of sensor information")
ascertain a location and an orientation associated with the actual sensor data (See Figure 2, item S13 "Estimate a position and orientation")
read a spatially resolved map comprising, for respective spatial locations, predicted values that characterize raw sensor data expected to be returned by the at least one sensor at the respective location (Paragraph [0027], "The map information according to the present exemplary embodiment includes key frame group information including one or a plurality of pieces of key frame information indicating a specific object in a real space. Using the map information, the position and orientation estimation unit 13 can compare features of feature points obtained from the sensor information with features of feature points measured in advance, and measure the position and orientation of the sensor 10 in a map coordinate system," here the map data contains spatial information regarding objects in the environment and their expected locations)
compare, for each of a plurality of the spatial locations, the predicted values from the map with corresponding actual values of the actual sensor data (Paragraph [0030], "The update information generation unit 15 generates update information for updating the map (map information) based on the positions of the feature points included in the map information stored in the map storage unit 14 and new positions of the feature points obtained from the sensor information. Here, the update information refers to information indicating differences between the positions of the feature points in the map information and the positions of the feature points obtained from the sensor information when the feature points included in the map information are observed by the sensor 10," here the system is determining update information based on comparing expected feature points in the map information and feature points in the sensor data)
to obtain at least one difference value representing deviation between expected and actual sensor data (Paragraph [0030], "The update information generation unit 15 generates update information for updating the map (map information) based on the positions of the feature points included in the map information stored in the map storage unit 14 and new positions of the feature points obtained from the sensor information. Here, the update information refers to information indicating differences between the positions of the feature points in the map information and the positions of the feature points obtained from the sensor information when the feature points included in the map information are observed by the sensor 10," here the system is determining update information based on comparing expected feature points in the map information and feature points in the sensor data, this update information is referring to information indicating differences between positions)
how the respective error source contributed to the obtainment of the difference value (Paragraph [0021], “where the feature points used for position and orientation estimation increase or decrease or the positions of the feature points change include: a case where position estimation errors of the feature points exceed a predetermined threshold or such errors accumulate … an impact of the errors varies greatly depending on measurement accuracy of the sensor and performance of the apparatus that performs the estimation processing,” here the system is determining the influence of errors on the position estimation)
determine: (I) whether the at least one difference value is indicative of fulfillment of a predefined environmental characterization of an environment surrounding the at least one sensor and (Paragraph [0064], "Specifically, the values of only the IDs for identifying the key frames, the updated positions and orientations, and the updated 3D positions of the feature points may be stored as the update information. If the occupancy grid map described in the foregoing modification is used as the map information, the coordinates of the grid cells and the rewrite values may be stored as the update information") (Paragraph [0088], "As employed herein, the value of each grid cell refers to the probability value of 0 to 1 representing whether there is an object in the occupancy grid map," here the system is determining an occupancy grid indicating whether the property, in this case whether the location is occupied, is fulfilled (occupied) or not fulfilled (unoccupied))
(II) a probability of such fulfillment of the predefined environmental characterization and (Paragraph [0088], “As employed herein, the value of each grid cell refers to the probability value of 0 to 1 representing whether there is an object in the occupancy grid map. In other words, the amount of change in the probability value of a specific grid cell before and after a map update may be used as the difference,” here each grid cell is assigned a probability of occupancy/environmental characterization)
control a movement or operation of the machine based on (a) whether the predefined environmental characterization is determined to be fulfilled and (b) the determined probability of such fulfillment (Paragraph [0064], “Specifically, the values of only the IDs for identifying the key frames, the updated positions and orientations, and the updated 3D positions of the feature points may be stored as the update information. If the occupancy grid map described in the foregoing modification is used as the map information, the coordinates of the grid cells and the rewrite values may be stored as the update information,” here the system is outputting an updated occupancy grid) (Paragraph [0088], "As employed herein, the value of each grid cell refers to the probability value of 0 to 1 representing whether there is an object in the occupancy grid map," here the system is outputting an occupancy grid, indicating a property, occupied or unoccupied, with an associated probability from 0 to 1) (Paragraph [0044], “In other words, the vehicle is moved based on the map information indicating the position and orientation of the vehicle at the target point (or route), and the position and orientation of the sensor 10 estimated by the position and orientation estimation unit 13. To calculate the control values, the control value determination unit 17 calculates all possible variations of the control values that reduce the Euclidean distance between the position and orientation of the vehicle and the position and orientation at the target point in the map information as control value candidates,” here the system is then controlling the vehicle based on the determined updated 3D positions of objects and feature points and the grid cell occupancy map with an associated probability).
However, Tomioka does not explicitly teach estimate, for each of a plurality of potential error sources, how the respective error source contributed to the obtainment of the difference value, and determine, based on both (i) the at least one difference value and (ii) the plurality of estimated contributions of the potential error sources.
Chandler teaches locating and modeling a 3D object captured in multiple time-series of sensor data of multiple sensor modalities, including:
estimate, for each of a plurality of potential error sources, how the respective error source contributed to the obtainment of the difference value (Paragraph [0074], “determining perception errors by comparing the vehicle detections against the pseudo ground truth,” here the system includes a determination of a difference value for sensor data versus the ground truth data) (Paragraph [0077], “where the pose of the object is changing in time, and thus a pose vector p.sub.i is determined for each timestep i of the time series corresponding to a captured frame for at least one sensor modality. The values of the shape, size and pose parameters may be adjusted so as to minimise a total error function 500 comprising multiple terms based on the available sensor data as well as shape and motion models,” here the system is determining the contribution of error from a plurality of potential sources, the error function including a plurality of terms/sources; this error function is used in the processing of the sensor data, such as the comparison which determines a difference value)
determine, based on both (i) the at least one difference value and (ii) the plurality of estimated contributions of the potential error sources (Paragraph [0094], “A shape regularisation term may be used to enforce consistency of the shape model with some prior knowledge of what the shape of the object should be. For example, in the semantic keypoint refinement mentioned above, the locations of the 3D semantic keypoints within the bounding box defining the object … a shape regularisation term 940 may be based on the probability of the modelled object keypoints under the respective probability distributions, where a less probable position would be penalised more heavily than a position close to the centre of the Gaussian. In general, a shape regularisation term 940 may be used to enforce consistency,” here the system determines a probability of a position and shape of an object based on the total error terms) (Paragraph [0065], “In use, the (instance of the) perception component 102 of the autonomous vehicle 200 interprets structure within perception inputs captured by the at least one sensor 202, in real time, in accordance with its training, and the autonomous vehicle controller 204 controls the speed and direction of the vehicle based on the results, with no or limited input from any human driver,” here using the determined total error the system determines the positions and existence of objects/environmental characterization, and using the probability of position and shape, the vehicle is controlled).
Tomioka and Chandler are analogous art as they are both generally related to systems and methods for processing and evaluating vehicle sensor data.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to include estimate, for each of a plurality of potential error sources, how the respective error source contributed to the obtainment of the difference value, and determine, based on both (i) the at least one difference value and (ii) the plurality of estimated contributions of the potential error sources, of Chandler in the evaluating of spatially resolved sensor data of Tomioka, with a reasonable expectation of success, in order to increase the accuracy of the perception of the system by aggregating a total error across a plurality of sensor modalities (Paragraph [0029], “to tune both the shape of the 3D object model and the time sequence of poses in a way that minimizes some overall measure of error defined in the cost function. A high level of perception accuracy is achieved by aggregating the overall error across both time and multiple sensor modalities, in a way that incorporated additional knowledge of the object class and the shape characteristics normally associated with the known object class”).
Regarding claim 11, Tomioka teaches a non-transitory machine-readable data carrier on which is stored a computer program for a machine equipped with at least one sensor, the computer program being executable by one or more computers and, when executed by the one or more computers, causing the one or more computers to perform the following steps (Paragraph [0195], “Embodiment(s) of the present disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’)”) (Paragraph [0003], "an information processing apparatus includes an obtaining unit configured to obtain a second position of a map feature point included in map information based on the map information and information obtained from a sensor mounted on a vehicle")
reading actual sensor data recorded by the at least one sensor (See Figure 2, item S12 "Receive input of sensor information")
ascertaining a location and an orientation associated with the actual sensor data (See Figure 2, item S13 "Estimate a position and orientation")
reading a spatially resolved map comprising, for respective spatial locations, predicted values that characterize raw sensor data expected to be returned by the at least one sensor at the respective location ("The map information according to the present exemplary embodiment includes key frame group information including one or a plurality of pieces of key frame information indicating a specific object in a real space. Using the map information, the position and orientation estimation unit 13 can compare features of feature points obtained from the sensor information with features of feature points measured in advance, and measure the position and orientation of the sensor 10 in a map coordinate system," here the map data contains spatial information regarding objects in the environment and their expected locations)
comparing, for each of a plurality of the spatial locations, the predicted values from the map with corresponding actual values of the actual sensor data (Paragraph [0030], "The update information generation unit 15 generates update information for updating the map (map information) based on the positions of the feature points included in the map information stored in the map storage unit 14 and new positions of the feature points obtained from the sensor information. Here, the update information refers to information indicating differences between the positions of the feature points in the map information and the positions of the feature points obtained from the sensor information when the feature points included in the map information are observed by the sensor 10," here the system is determining update information based on comparing expected feature points in the map information and feature points in the sensor data)
to obtain at least one difference value representing deviation between expected and actual sensor data (Paragraph [0030], "The update information generation unit 15 generates update information for updating the map (map information) based on the positions of the feature points included in the map information stored in the map storage unit 14 and new positions of the feature points obtained from the sensor information. Here, the update information refers to information indicating differences between the positions of the feature points in the map information and the positions of the feature points obtained from the sensor information when the feature points included in the map information are observed by the sensor 10," here the system is determining update information based on comparing expected feature points in the map information and feature points in the sensor data, this update information is referring to information indicating differences between positions)
how the respective error source contributed to the obtainment of the difference value (Paragraph [0021], “where the feature points used for position and orientation estimation increase or decrease or the positions of the feature points change include: a case where position estimation errors of the feature points exceed a predetermined threshold or such errors accumulate … an impact of the errors varies greatly depending on measurement accuracy of the sensor and performance of the apparatus that performs the estimation processing,” here the system is determining the influence of errors on the position estimation)
determining: (I) whether the at least one difference value is indicative of fulfillment of a predefined environmental characterization of an environment surrounding the at least one sensor and (Paragraph [0064], "Specifically, the values of only the IDs for identifying the key frames, the updated positions and orientations, and the updated 3D positions of the feature points may be stored as the update information. If the occupancy grid map described in the foregoing modification is used as the map information, the coordinates of the grid cells and the rewrite values may be stored as the update information") (Paragraph [0088], "As employed herein, the value of each grid cell refers to the probability value of 0 to 1 representing whether there is an object in the occupancy grid map," here the system is determining an occupancy grid indicating whether the property, in this case whether the location is occupied, is fulfilled (occupied) or not fulfilled (unoccupied))
(II) a probability of such fulfillment of the predefined environmental characterization (Paragraph [0088], “As employed herein, the value of each grid cell refers to the probability value of 0 to 1 representing whether there is an object in the occupancy grid map. In other words, the amount of change in the probability value of a specific grid cell before and after a map update may be used as the difference,” here each grid cell is assigned a probability of occupancy/environmental characterization)
and controlling a movement or operation of the machine based on (a) whether the predefined environmental characterization is determined to be fulfilled and (b) the determined probability of such fulfillment (Paragraph [0064], “Specifically, the values of only the IDs for identifying the key frames, the updated positions and orientations, and the updated 3D positions of the feature points may be stored as the update information. If the occupancy grid map described in the foregoing modification is used as the map information, the coordinates of the grid cells and the rewrite values may be stored as the update information,” here the system is outputting an updated occupancy grid) (Paragraph [0088], "As employed herein, the value of each grid cell refers to the probability value of 0 to 1 representing whether there is an object in the occupancy grid map," here the system is outputting an occupancy grid, indicating a property, occupied or unoccupied, with an associated probability from 0 to 1) (Paragraph [0044], “In other words, the vehicle is moved based on the map information indicating the position and orientation of the vehicle at the target point (or route), and the position and orientation of the sensor 10 estimated by the position and orientation estimation unit 13. To calculate the control values, the control value determination unit 17 calculates all possible variations of the control values that reduce the Euclidean distance between the position and orientation of the vehicle and the position and orientation at the target point in the map information as control value candidates,” here the system is then controlling the vehicle based on the determined updated 3D positions of objects and feature points and the grid cell occupancy map with an associated probability).
However, Tomioka does not explicitly teach estimating, for each of a plurality of potential error sources, how the respective error source contributed to the obtainment of the difference value, and determining, based on both (i) the at least one difference value and (ii) the plurality of estimated contributions of the potential error sources.
Chandler teaches locating and modeling a 3D object captured in multiple time-series of sensor data of multiple sensor modalities, including:
estimating, for each of a plurality of potential error sources, how the respective error source contributed to the obtainment of the difference value (Paragraph [0074], “determining perception errors by comparing the vehicle detections against the pseudo ground truth,” here the system includes a determination of a difference value for sensor data versus the ground truth data) (Paragraph [0077], “where the pose of the object is changing in time, and thus a pose vector p.sub.i is determined for each timestep i of the time series corresponding to a captured frame for at least one sensor modality. The values of the shape, size and pose parameters may be adjusted so as to minimise a total error function 500 comprising multiple terms based on the available sensor data as well as shape and motion models,” here the system is determining the contribution of error from a plurality of potential sources, the error function including a plurality of terms/sources; this error function is used in the processing of the sensor data, such as the comparison which determines a difference value)
determining, based on both (i) the at least one difference value and (ii) the plurality of estimated contributions of the potential error sources (Paragraph [0094], “A shape regularisation term may be used to enforce consistency of the shape model with some prior knowledge of what the shape of the object should be. For example, in the semantic keypoint refinement mentioned above, the locations of the 3D semantic keypoints within the bounding box defining the object … a shape regularisation term 940 may be based on the probability of the modelled object keypoints under the respective probability distributions, where a less probable position would be penalised more heavily than a position close to the centre of the Gaussian. In general, a shape regularisation term 940 may be used to enforce consistency,” here the system determines a probability of a position and shape of an object based on the total error terms) (Paragraph [0065], “In use, the (instance of the) perception component 102 of the autonomous vehicle 200 interprets structure within perception inputs captured by the at least one sensor 202, in real time, in accordance with its training, and the autonomous vehicle controller 204 controls the speed and direction of the vehicle based on the results, with no or limited input from any human driver,” here using the determined total error the system determines the positions and existence of objects/environmental characterization, and using the probability of position and shape, the vehicle is controlled).
Tomioka and Chandler are analogous art as they are both generally related to systems and methods for processing and evaluating vehicle sensor data.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to include estimating, for each of a plurality of potential error sources, how the respective error source contributed to the obtainment of the difference value, and determining, based on both (i) the at least one difference value and (ii) the plurality of estimated contributions of the potential error sources, of Chandler in the evaluating of spatially resolved sensor data of Tomioka, with a reasonable expectation of success, in order to increase the accuracy of the perception of the system by aggregating a total error across a plurality of sensor modalities (Paragraph [0029], “to tune both the shape of the 3D object model and the time sequence of poses in a way that minimizes some overall measure of error defined in the cost function. A high level of perception accuracy is achieved by aggregating the overall error across both time and multiple sensor modalities, in a way that incorporated additional knowledge of the object class and the shape characteristics normally associated with the known object class”).
Regarding claim 12, the combination of Tomioka and Chandler teaches the method as discussed above in claim 1. Tomioka further teaches wherein the predefined environmental characterization is a characterization of whether the environment is drivable (Paragraph [0018], “A first exemplary embodiment of the present disclosure can reduce abrupt changes occurring in the estimation result of the position and orientation of an object estimating its position by using map information when the map information is updated.”) (Paragraph [0088], “Furthermore, the amount of change in the value of each grid cell of the occupancy grid map due to a map update may be used as the difference. As employed herein, the value of each grid cell refers to the probability value of 0 to 1 representing whether there is an object in the occupancy grid map. In other words, the amount of change in the probability value of a specific grid cell before and after a map update may be used as the difference,” here the system is directed towards evaluating sensor data to determine the accurate existence and position of objects; an object is an environmental characterization that would interfere with the drivability of the vehicle, and if the occupancy grid indicates an object, the space is not drivable, as the vehicle would collide with the object) (Examiner's Note: here Tomioka is determining a drivable area via the detection of obstacles; Chandler also teaches the limitation of determining a drivable area, such as in Paragraph [0061]).
Regarding claim 13, the combination of Tomioka and Chandler teaches the method as discussed above in claim 1. Tomioka further teaches during real-time operation of the machine, which includes continuous real-time determination of a state of the environment based on the actual sensor data (Paragraph [0071], “The information processing apparatus 1 can be applied to any configuration that calculates a position and orientation by using an updated map. For example, the vehicle may be a mobile robot, an automatic guided vehicle (AGV), autonomous mobile robot (AMR), unmanned ground vehicle (UGV), autonomous underwater vehicle (AUV), an unmanned carrier, a self-driving vehicle, or an unmanned aerial vehicle (UAV) such as drone. The movement control described in the present exemplary embodiment may be applied thereto. The movement control described in the present exemplary embodiment may also be applied to vehicles that fly and move in the air, vehicles that move on the water, and vehicles that submerge and move in the water, aside from vehicles walking or running on the ground.”) (Paragraph [0003], “the update unit restrains an update of the map information, and wherein, in a case where the vehicle traveling near the vehicle first position continues traveling along the closed route, the update unit updates the map information between when the vehicle gets a predetermined distance or more away from near the vehicle first position and when the vehicle returns to a predetermined distance or less away from the vehicle second position”).
However, Tomioka does not explicitly teach wherein the potential error sources comprise a measurement uncertainty of the at least one sensor, a pose uncertainty associated with the ascertained location and orientation, and a map inaccuracy associated with the spatially resolved map, and wherein, during real-time operation of the machine, which includes continuous real-time determination of a state of the environment based on the actual sensor data, respective abilities of the respective potential error sources to contribute to the actual sensor data deviating from the predicted values of the expected sensor data are continuously evaluated to account for respective changes in the respective abilities occurring over time, and the continuously evaluated abilities are used for the estimating step.
Chandler further teaches wherein the potential error sources comprise a measurement uncertainty of the at least one sensor, a pose uncertainty associated with the ascertained location and orientation, and a map inaccuracy associated with the spatially resolved map (Paragraph [0087], “However, in this case each timestep i corresponds with a time instant at which an individual lidar measurement occurred and a lidar error is computed for each measurement before aggregating over the full time series,” here the system accounts for a measurement uncertainty/error of at least one sensor/LIDAR) (Paragraph [0077], “The values of the shape, size and pose parameters may be adjusted so as to minimise a total error function 500 comprising multiple terms based on the available sensor data as well as shape and motion models,” here the system includes a pose error in the total error function) (Paragraph [0106], “A ‘depth’ error term 514, denoted E.sub.depth may be defined where other 3D data is available for the given image, for example a stereoscopic depth map … a depth error term may penalise deviations between the 3D depth information from the given sensor modality and the expected depth of the object based on the current estimate of the object shape and pose,” here the system includes an inaccuracy associated with other 3D data such as depth map information)
and during real-time operation of the machine, which includes continuous real-time determination of a state of the environment based on the actual sensor data, respective abilities of the respective potential error sources to contribute to the actual sensor data deviating from the predicted values of the expected sensor data are continuously evaluated to account for respective changes in the respective abilities occurring over time, and the continuously evaluated abilities are used for the estimating step (Paragraph [0065], “In use, the (instance of the) perception component 102 of the autonomous vehicle 200 interprets structure within perception inputs captured by the at least one sensor 202, in real time, in accordance with its training, and the autonomous vehicle controller 204 controls the speed and direction of the vehicle based on the results, with no or limited input from any human driver,” here the perception operations to determine a state of the environment and evaluation of errors are being performed in real time in order to autonomously control the vehicle).
Tomioka and Chandler are analogous art as they are both generally related to systems and methods for processing and evaluating vehicle sensor data.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the instant application to include wherein the potential error sources comprise a measurement uncertainty of the at least one sensor, a pose uncertainty associated with the ascertained location and orientation, and a map inaccuracy associated with the spatially resolved map, and wherein, during real-time operation of the machine, which includes continuous real-time determination of a state of the environment based on the actual sensor data, respective abilities of the respective potential error sources to contribute to the actual sensor data deviating from the predicted values of the expected sensor data are continuously evaluated to account for respective changes in the respective abilities occurring over time, and the continuously evaluated abilities are used for the estimating step, of Chandler in the evaluating of spatially resolved sensor data of Tomioka, with a reasonable expectation of success, in order to increase the accuracy of the perception of the system by aggregating a total error across a plurality of sensor modalities (Paragraph [0029], “to tune both the shape of the 3D object model and the time sequence of poses in a way that minimizes some overall measure of error defined in the cost function. A high level of perception accuracy is achieved by aggregating the overall error across both time and multiple sensor modalities, in a way that incorporated additional knowledge of the object class and the shape characteristics normally associated with the known object class”).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Astrom (US-10368217) teaches a method for generating a model of an environment enabling positioning within the environment, including determining a plurality of error sources. Aghamohammadi (US-20180012370) teaches obtaining sensor measurements and using those measurements to determine probability distributions. Gustafsson (US-11237005) teaches methods and arrangements for generating and updating maps representing a location using sensor data received from vehicles.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHRISTOPHER FEES whose telephone number is (303)297-4343. The examiner can normally be reached Monday-Thursday 7:30 - 5:30 MT.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Aniss Chad, can be reached at (571) 270-3832. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CHRISTOPHER GEORGE FEES/Examiner, Art Unit 3662